Abstract:
Deep learning-based methods have enhanced the performance of many robot applications thanks to their superior ability to robustly extract rich, high-dimensional features. However, this comes with a high computational cost that often increases the latency of the overall system. In contrast, traditional feature extraction methods, such as oriented FAST and rotated BRIEF (ORB), can be computed efficiently and remain the backbone of several critical robot algorithms, e.g., ORBSLAM for robot state estimation. In this work, the usefulness of these features for a robot navigation task is investigated. The features are experimentally incorporated into a deep learning method, called ORB-Net, which allows an agile aerial robot to learn a motion policy for completing a racing track in an autonomous drone racing context. The experimental studies demonstrate that reusing these already computed features for end-to-end motion planning of the agile quadrotor can be beneficial: the proposed method, which takes the combined input of ORB feature positions and RGB images, outperforms baseline methods that use only RGB images.
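To illustrate the idea of feeding both ORB keypoint positions and the RGB frame to a learned policy, the following minimal sketch (not the authors' implementation) uses OpenCV's ORB detector to build a two-branch policy input. The fixed keypoint budget MAX_KP and the two-branch interface are hypothetical placeholders introduced here for illustration only.

    # Minimal sketch: extract ORB keypoint positions and pair them with the
    # RGB frame as a combined policy input. MAX_KP is a hypothetical budget.
    import cv2
    import numpy as np

    MAX_KP = 100  # hypothetical fixed number of keypoints fed to the policy
    orb = cv2.ORB_create(nfeatures=MAX_KP)

    def build_policy_input(rgb_frame: np.ndarray):
        gray = cv2.cvtColor(rgb_frame, cv2.COLOR_BGR2GRAY)
        keypoints = orb.detect(gray, None)
        h, w = gray.shape
        # Normalised (x, y) keypoint positions, zero-padded to a fixed length.
        pts = np.zeros((MAX_KP, 2), dtype=np.float32)
        for i, kp in enumerate(keypoints[:MAX_KP]):
            pts[i] = (kp.pt[0] / w, kp.pt[1] / h)
        image = rgb_frame.astype(np.float32) / 255.0
        return image, pts  # two-branch input: RGB tensor + ORB positions

Since the ORB keypoints are already computed by the state-estimation pipeline in such a system, the only added cost of this input scheme is the padding and normalisation step.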
Date of Conference: 26-27 September 2023
Date Added to IEEE Xplore: 18 December 2023
Print ISBN: 978-3-8007-6140-1
Conference Location: Stuttgart, Germany