Tracking targets in forward-looking infrared (FLIR) video sequences taken from airborne platforms is a challenging task. Several tracking failure modes can occur; in particular, discontinuities due to the platform's motion can produce the so-called ego-motion failure, leading to unrecoverable tracking errors. A novel ego-motion compensation technique for unmanned aerial vehicles (UAVs) is proposed. Data received from the autopilot are used to predict the motion of the platform, identifying a smaller region of the image (subframe) in which the candidate target is searched for in the next frame of the sequence. The proposed methodology is compared with a recent robust algorithm for automatic target tracking; experimental results show that the proposed motion-estimation approach improves performance both in terms of frames processed per second (targets are searched for in smaller regions) and in terms of robustness (targets are correctly tracked throughout all frames of the sequence).
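The idea of using autopilot data to bound the search region can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a pinhole camera with a small-angle approximation, and all function and parameter names (`predicted_shift`, `search_subframe`, `focal_px`, `margin`) are hypothetical.

```python
import numpy as np

def predicted_shift(d_yaw, d_pitch, focal_px):
    """Approximate image-plane shift (dx, dy) in pixels induced by a
    small change in camera yaw/pitch (radians), reported by the
    autopilot, under a pinhole small-angle model (an assumption)."""
    dx = focal_px * np.tan(d_yaw)
    dy = focal_px * np.tan(d_pitch)
    return dx, dy

def search_subframe(prev_pos, d_yaw, d_pitch, focal_px,
                    frame_shape, margin=20):
    """Return (x0, y0, x1, y1) bounds of the subframe in which the
    candidate target is searched for in the next frame, centered on
    the ego-motion-compensated prediction of the target position."""
    dx, dy = predicted_shift(d_yaw, d_pitch, focal_px)
    cx, cy = prev_pos[0] + dx, prev_pos[1] + dy
    h, w = frame_shape
    x0 = int(max(0, cx - margin))
    x1 = int(min(w, cx + margin))
    y0 = int(max(0, cy - margin))
    y1 = int(min(h, cy + margin))
    return x0, y0, x1, y1
```

Because the tracker's matching step then runs only inside this subframe rather than the full frame, the per-frame cost drops roughly with the area ratio, which is consistent with the reported gain in frames processed per second.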