
Multi-Sensor Fusion and Active Perception for Autonomous Deep Space Navigation


Abstract:

Keeping track of the current state, a task referred to as state estimation, is crucial for mobile autonomous systems. Solving it requires fusing information from all available sensors, including both relative measurements and observations of the surroundings. In a dynamic 3D environment, the pose of an agent must be chosen such that the most relevant information can be observed. We propose an approach for multi-sensor fusion and active perception within an autonomous deep space navigation scenario, and describe the probabilistic modeling of observables and sensors for that particular domain. For state estimation, we present an Extended Kalman Filter, an Unscented Kalman Filter, and a Particle Filter, all of which operate on a manifold state space. Additionally, we propose an approach for active perception that selects the desired attitude of the spacecraft based on knowledge of the dynamics of celestial objects, the kind of information they provide, and the current uncertainty of the filters. We evaluated the localization performance of the algorithms within a simulation environment. The filters are compared to each other, and we show that our active perception strategy outperforms two other information intake approaches.
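The core idea of the proposed active perception strategy can be illustrated with a minimal sketch: among a set of candidate observation targets, greedily pick the one whose measurement most reduces the filter's current uncertainty. This is purely illustrative and not the paper's implementation; the diagonal covariance, the scalar-measurement model, and all target names and noise values below are assumptions.

```python
def posterior_variance(prior_var, meas_var):
    # Scalar Kalman update of a single variance:
    # var' = (1/var + 1/R)^-1
    return 1.0 / (1.0 / prior_var + 1.0 / meas_var)

def select_target(cov_diag, candidates):
    """Pick the candidate whose observation most reduces total variance.

    cov_diag   : per-axis variances of the current state estimate
    candidates : dict name -> (observed_axis, measurement_noise_variance)
    """
    best_name, best_trace = None, float("inf")
    for name, (axis, noise) in candidates.items():
        post = list(cov_diag)
        post[axis] = posterior_variance(post[axis], noise)
        trace = sum(post)  # expected posterior uncertainty
        if trace < best_trace:
            best_name, best_trace = name, trace
    return best_name, best_trace

# Hypothetical scenario: uncertainty is largest along the x axis.
cov = [4.0, 1.0, 0.25]
targets = {
    "asteroid_A": (0, 1.0),   # constrains x, moderate noise
    "star_B":     (1, 0.01),  # constrains y, very low noise
}
choice, trace = select_target(cov, targets)
# choice == "asteroid_A": observing along the most uncertain axis
# yields the larger reduction, even though star_B is less noisy.
```

Note that the noisier target wins here because it constrains the axis with the largest variance; the paper's strategy additionally accounts for the dynamics of the celestial objects, which this sketch omits.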
Date of Conference: 10-13 July 2018
Date Added to IEEE Xplore: 06 September 2018
Conference Location: Cambridge, UK

