Tightly-coupled image-aided inertial relative navigation using Statistical Predictive Rendering (SPR) techniques and a priori world Models

2 Author(s)

Autonomous navigation in areas where Global Positioning System (GPS) solutions are unavailable continues to be a significant challenge. One example application is the relative targeting and navigation problem for next-generation autonomous vehicles. In this application, refined navigation state information (position, velocity, and attitude) can be determined with the addition of a high-resolution camera to an Inertial Navigation System (INS)-aided navigation system. In the proposed employment scenario, GPS information is not available, the location and structure of a reference landmark are known to a high degree of precision, and the initial navigation states (along with their respective uncertainties) of the vehicle are known to a variable degree of uncertainty. The landmark environment is modeled in advance using commercially available Computer Aided Design (CAD) software and photographs of objects within the scene. This information is combined with INS data in a statistically rigorous predictive rendering algorithm to determine error states for implementation in an Unscented Kalman Filter (UKF). The error states are then used to correct the navigation solution. Several methods of exploiting the available information are compared to determine "best performers" in terms of speed, precision, and situational appropriateness. For this research, all methods are based on a proposed Statistical Predictive Rendering (SPR) technique, which consists of constructing synthetic views of the scene from the perspective of the vehicle for comparison with actual images from the on-board camera. Each predictively rendered image is compared to measured images using either feature-based or pixel-based comparison methods, which serve to improve the accuracy of the correspondence search technique employed.
Vision-aided navigation is an active area of research that draws on the estimation, image processing, and navigation fields of engineering. Past efforts have focused on stochastically constraining feature-point correspondence in successive images of the ground from the perspective of an overflying air vehicle using an Extended Kalman Filter (EKF) or UKF, and on applying SPR to the problem of autonomous aerial refueling using an EKF. The proposed algorithm elements are tested using a combination of experimental and simulated data. Currently, the simulated flight profiles show that navigation solution accuracy and robustness are improved by including SPR-based visual information in the tightly coupled framework. Further experimental tests will be conducted in our laboratory using realistic scenes and in flight as part of a Test Pilot School project. Conclusions regarding the performance of the tightly-coupled SPR technique will be presented.
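The abstract does not specify which pixel-based similarity metric is used to compare the predictively rendered image against the measured camera image. As a minimal illustrative sketch (not the authors' implementation), normalized cross-correlation is one common pixel-based choice: it scores how well two grayscale images match, independent of overall brightness and contrast. The `ncc` function and the synthetic test image below are assumptions for illustration only.

```python
import numpy as np

def ncc(predicted, measured):
    """Normalized cross-correlation between a predictively rendered
    image and a measured camera image (2-D grayscale arrays).
    Returns a score in [-1, 1]; 1 indicates a perfect pixel-wise match."""
    p = predicted.astype(float) - predicted.mean()
    m = measured.astype(float) - measured.mean()
    denom = np.sqrt((p * p).sum() * (m * m).sum())
    if denom == 0.0:
        return 0.0  # one image has no contrast; correlation undefined
    return float((p * m).sum() / denom)

# Synthetic stand-in for a rendered view: a smooth 2-D window.
img = np.outer(np.hanning(32), np.hanning(32))

print(ncc(img, img))                          # ~1.0: identical images
print(ncc(img, np.roll(img, 4, axis=1)))      # < 1.0: misaligned copy
```

In an SPR-style loop, a score like this (or a feature-based equivalent) would be evaluated between the synthetic view rendered at the INS-predicted pose and the actual camera frame, and the resulting residual would feed the UKF's error-state update.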

Published in:

Position Location and Navigation Symposium (PLANS), 2010 IEEE/ION

Date of Conference:

4-6 May 2010
