While the global positioning system (GPS) is the most widely used sensor modality for aircraft navigation, the desire to operate in GPS-denied environments has motivated researchers to investigate other navigational sensor modalities. Owing to advances in computer vision and control theory, monocular camera systems have received growing interest as an alternative or complementary sensor to GPS. Cameras can act as navigational sensors by detecting and tracking feature points in an image. One limiting factor in this approach is the current inability to relate feature points as they enter and leave the camera's field of view. This paper continues research efforts to provide a vision-based position estimation method for aircraft guidance. A recently developed estimation method is integrated with a new, nonlinear flight model of an aircraft. The vision-based estimation scheme provides input directly to the vehicle guidance system and autopilot.
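The abstract does not specify which feature detector the vision pipeline uses; as a generic illustration of "detecting and tracking feature points in an image," the sketch below implements a standard Harris corner response (R = det(M) - k·tr(M)², where M accumulates image-gradient outer products over a window) on a synthetic frame, then re-detects the feature after the frame shifts. It is a NumPy-only toy, not the paper's estimation method, and the window size, k, and image contents are illustrative assumptions.

```python
import numpy as np

def harris_response(img, k=0.04, win=2):
    """Per-pixel Harris corner response R = det(M) - k * trace(M)^2,
    where M sums gradient outer products over a (2*win+1)^2 window.
    (Toy implementation: explicit loops, fine for small images.)"""
    img = img.astype(float)
    Iy, Ix = np.gradient(img)
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy
    h, w = img.shape
    R = np.zeros((h, w))
    for r in range(win, h - win):
        for c in range(win, w - win):
            sl = (slice(r - win, r + win + 1), slice(c - win, c + win + 1))
            sxx, syy, sxy = Ixx[sl].sum(), Iyy[sl].sum(), Ixy[sl].sum()
            R[r, c] = (sxx * syy - sxy * sxy) - k * (sxx + syy) ** 2
    return R

# Synthetic frame: a bright square whose corners are trackable features.
frame = np.zeros((32, 32))
frame[10:20, 10:20] = 1.0

R = harris_response(frame)
peak = np.unravel_index(np.argmax(R), R.shape)  # strongest corner feature

# "Tracking": the scene translates by (2, 3) pixels between frames;
# re-detecting shows the feature moving by the same offset.
frame2 = np.roll(frame, (2, 3), axis=(0, 1))
R2 = harris_response(frame2)
peak2 = np.unravel_index(np.argmax(R2), R2.shape)
```

In a real pipeline the detected corners would be matched frame-to-frame (e.g., by optical flow) and fed to the position estimator; the difficulty the abstract highlights is re-associating such features once they leave and re-enter the field of view.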