
Perception-aware Full Body Trajectory Planning for Autonomous Systems using Motion Primitives


Abstract:

Many robotic systems rely on visual sensing to simultaneously accomplish the tasks of state estimation, mapping, and path planning. On the one hand, the use of camera sensors represents a power-efficient and lightweight option for solving this problem. On the other hand, these tasks pose requirements on the quality of the visual input (e.g. the number of tracked features for Visual Odometry) that often conflict with the optimal viewpoint for local mapping and obstacle avoidance. Dealing with this constraint is actively researched in the field of perception-aware planning. The approaches delivered by this field mostly concern micro air vehicles (MAVs), but could be applied to a larger group of robotic systems. We propose a perception-aware trajectory planner for a class of robotic systems that can orient their cameras independently from their direction of travel. By using motion primitives, our planner does not require differentiable models for motion and perception objectives. We evaluate our method in simulation, showing increased capabilities in localization-aware motions around obstacles, and demonstrate its run-time capability on a real planetary rover. The code is released publicly at github.com/DLR-RM/palp.
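The key idea the abstract describes, selecting among sampled motion primitives so that a non-differentiable perception objective (such as a tracked-feature count) can be traded off against progress toward the goal, can be illustrated with a minimal sketch. The code below is not the authors' planner; all names, the 90-degree field of view, the cost weights, and the 2D primitive representation are illustrative assumptions.

```python
import math

# Hypothetical sketch: evaluate motion primitives where the camera yaw is
# decoupled from the direction of travel, as in the paper's problem setting.
# The perception term counts landmarks inside the field of view -- a step
# function, hence non-differentiable, which sampling-based primitive
# selection handles without gradients.

def visible_features(cam_yaw, features, half_fov=math.radians(45)):
    """Count landmarks (x, y) whose bearing lies within the camera FOV."""
    count = 0
    for fx, fy in features:
        bearing = math.atan2(fy, fx)
        # wrap angular difference into [-pi, pi] before comparing
        diff = abs((bearing - cam_yaw + math.pi) % (2 * math.pi) - math.pi)
        if diff <= half_fov:
            count += 1
    return count

def select_primitive(primitives, features, goal, w_perc=0.5):
    """Pick the primitive minimizing path cost minus a weighted feature count.

    Each primitive is (dx, dy, cam_yaw): a body displacement plus an
    independently chosen camera heading (illustrative parameterization).
    """
    best, best_cost = None, float("inf")
    for dx, dy, cam_yaw in primitives:
        path_cost = math.hypot(goal[0] - dx, goal[1] - dy)
        perception_reward = w_perc * visible_features(cam_yaw, features)
        cost = path_cost - perception_reward
        if cost < best_cost:
            best, best_cost = (dx, dy, cam_yaw), cost
    return best
```

With two primitives that make identical progress toward the goal but point the camera in different directions, the one that keeps more landmarks in view wins, which is the behavior a decoupled-camera planner exploits.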
Date of Conference: 14-18 October 2024
Date Added to IEEE Xplore: 25 December 2024
Conference Location: Abu Dhabi, United Arab Emirates

