Directed Real-World Learned Exploration

Abstract:

Automated Guided Vehicles (AGVs) are omnipresent and are able to carry out various kinds of preprogrammed tasks. Unfortunately, a lot of manual configuration is still required to make these systems operational, and the configuration needs to be redone whenever the environment or task changes. As an alternative to current inflexible methods, we employ a learning-based method to perform directed exploration of a previously unseen environment. Instead of relying on handcrafted heuristic representations, the agent learns its own environmental representation through its embodiment. Our method offers loose coupling between the Reinforcement Learning (RL) agent, which is trained in simulation, and a separate task module trained on real-world images. The uncertainty of the task module is used to direct the exploration behavior. As an example, we use a warehouse inventory task and show how directed exploration can improve task performance through active data collection. We also propose a novel environment representation to efficiently tackle the sim2real gap in both sensing and actuation. We empirically evaluate the approach in both simulated environments and a real-world warehouse.
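The abstract does not state how the task module's uncertainty is computed or how it is coupled to the exploration policy. The sketch below illustrates one common realization under assumptions of our own: ensemble disagreement as the uncertainty estimate, fed to the RL agent as an intrinsic exploration bonus. All names (EnsembleTaskModule, exploration_reward) and the variance-based formulation are illustrative, not the authors' implementation.

import numpy as np

class EnsembleTaskModule:
    """Illustrative stand-in for the task module (e.g. a warehouse inventory classifier).

    Uncertainty is estimated as the disagreement (variance) of an ensemble of
    K independently initialized predictors -- an assumption, since the abstract
    does not specify how uncertainty is obtained.
    """

    def __init__(self, num_members=5, feature_dim=16, num_classes=10, seed=0):
        rng = np.random.default_rng(seed)
        # Random linear heads stand in for K separately trained networks.
        self.heads = [rng.normal(size=(feature_dim, num_classes))
                      for _ in range(num_members)]

    def predict_proba(self, features):
        """Return per-member class probabilities, shape (K, num_classes)."""
        probs = []
        for W in self.heads:
            logits = features @ W
            e = np.exp(logits - logits.max())
            probs.append(e / e.sum())
        return np.stack(probs)

    def uncertainty(self, features):
        """Mean per-class variance across ensemble members (disagreement)."""
        probs = self.predict_proba(features)
        return float(probs.var(axis=0).mean())


def exploration_reward(task_module, observation_features, scale=1.0):
    """Intrinsic reward for the exploration agent.

    Observations where the task module is uncertain yield a higher bonus,
    steering the agent to collect data where the task module needs it most.
    """
    return scale * task_module.uncertainty(observation_features)


if __name__ == "__main__":
    module = EnsembleTaskModule()
    obs = np.random.default_rng(1).normal(size=16)  # hypothetical observation features
    print("exploration bonus:", exploration_reward(module, obs))

Keeping the reward computation outside the task module's training loop mirrors the loose coupling described in the abstract: the simulation-trained RL agent only queries a scalar uncertainty, so the real-image task module can be retrained or swapped without touching the exploration policy.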
Date of Conference: 01-05 October 2023
Date Added to IEEE Xplore: 13 December 2023
Conference Location: Detroit, MI, USA
