Abstract:
Autonomous navigation at high speeds in off-road environments requires robots to comprehensively understand their surroundings using onboard sensing only. The extreme conditions of the off-road setting can cause degraded image quality as well as limited, sparse geometric information from light detection and ranging (LiDAR) sensing when driving at high speeds. In this work, we present RoadRunner, a novel framework capable of predicting terrain traversability and elevation directly from camera and LiDAR sensor inputs. RoadRunner enables reliable autonomous navigation by fusing sensory information and generating contextually informed predictions about the geometry and traversability of the terrain while operating at low latency. In contrast to existing methods, which classify handcrafted semantic classes and use heuristics to predict traversability costs, our method predicts traversability directly and is trained on labels that can be generated automatically in hindsight in a self-supervised fashion. The RoadRunner network architecture builds upon advances from the autonomous driving domain that allow us to embed LiDAR and camera information into a common bird's-eye-view perspective. Training is enabled by utilizing an existing traversability estimation stack to generate training data in hindsight in a scalable manner from real-world off-road driving datasets. Furthermore, RoadRunner reduces system latency by a factor of ~4, from 500 ms to 140 ms, while improving the accuracy of traversability cost and elevation map predictions. We demonstrate the effectiveness of RoadRunner in enabling safe and reliable off-road navigation at high speeds in multiple real-world driving scenarios through unstructured desert environments.
Published in: IEEE Transactions on Field Robotics (Volume: 1)
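To make the input/output interface described in the abstract concrete, the following is a minimal, illustrative PyTorch sketch, not the authors' implementation: camera images and a LiDAR point cloud are fused into a shared bird's-eye-view (BEV) grid, from which per-cell traversability cost and elevation are regressed. All module names, tensor shapes, and the simple "scatter points into an occupancy grid" fusion are assumptions made for illustration only; the actual RoadRunner architecture uses learned view transforms from the autonomous driving literature.

```python
# Hypothetical sketch of the camera + LiDAR -> BEV traversability/elevation interface.
import torch
import torch.nn as nn


class BEVTraversabilityNet(nn.Module):
    def __init__(self, bev_size=(128, 128), img_feat_dim=32, lidar_feat_dim=32):
        super().__init__()
        self.bev_size = bev_size
        # Toy image encoder; a real system would lift image features into BEV
        # with a learned view transform rather than global pooling.
        self.img_encoder = nn.Sequential(
            nn.Conv2d(3, img_feat_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.lidar_proj = nn.Conv2d(1, lidar_feat_dim, 1)
        self.bev_decoder = nn.Sequential(
            nn.Conv2d(img_feat_dim + lidar_feat_dim, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
        )
        # Two dense prediction heads: traversability cost in [0, 1] and elevation (m).
        self.trav_head = nn.Conv2d(64, 1, 1)
        self.elev_head = nn.Conv2d(64, 1, 1)

    def rasterize_lidar(self, points, grid_extent=25.0):
        """Scatter LiDAR points (B, N, 3) into a (B, 1, H, W) occupancy-style BEV grid."""
        B, _, _ = points.shape
        H, W = self.bev_size
        grid = torch.zeros(B, 1, H, W, device=points.device)
        # Map x, y in [-grid_extent, grid_extent] meters to grid indices.
        ij = ((points[..., :2] + grid_extent) / (2 * grid_extent)
              * torch.tensor([H, W], device=points.device)).long().clamp_min(0)
        ij[..., 0] = ij[..., 0].clamp_max(H - 1)
        ij[..., 1] = ij[..., 1].clamp_max(W - 1)
        for b in range(B):
            grid[b, 0, ij[b, :, 0], ij[b, :, 1]] = 1.0
        return grid

    def forward(self, images, points):
        # images: (B, 3, H_img, W_img); points: (B, N, 3)
        H, W = self.bev_size
        img_feat = self.img_encoder(images).expand(-1, -1, H, W)   # broadcast over BEV grid
        lidar_feat = self.lidar_proj(self.rasterize_lidar(points))
        bev = self.bev_decoder(torch.cat([img_feat, lidar_feat], dim=1))
        traversability = torch.sigmoid(self.trav_head(bev))        # (B, 1, H, W) cost map
        elevation = self.elev_head(bev)                            # (B, 1, H, W) height map
        return traversability, elevation


if __name__ == "__main__":
    net = BEVTraversabilityNet()
    trav, elev = net(torch.randn(2, 3, 224, 224), torch.randn(2, 4096, 3) * 10.0)
    print(trav.shape, elev.shape)  # torch.Size([2, 1, 128, 128]) for both heads
```

In this sketch, the two dense heads mirror the paper's joint prediction of traversability cost and an elevation map; the self-supervised training targets described in the abstract (labels generated in hindsight by an existing traversability estimation stack) would supply the supervision for both outputs.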