
A system and method for auto-correction of first order lens distortion


Authors:

Fry, J.; Integrated Design Services, Pennsylvania State Univ., University Park, PA, USA; Pusateri, M.

Abstract:

In multispectral imaging systems, correction for lens distortion is required before pixel-by-pixel fusion techniques can be applied. While correction of optical aberration can be extended to higher order terms, for many systems a first order correction is sufficient to achieve the desired results. In producing a multispectral imaging system in production quantities, the process of producing the corrections needs to be largely automated, as each lens will require its own corrections. We discuss an auto-correction and bench-sighting method applied to a dual band imaging system. In principle, we wish to image a dual band target and completely determine the lens distortion parameters for the given optics. We begin with a scale-preserving, radial, first-order lens distortion model; this model allows the horizontal field of view to be determined independently of the distortion. It has the benefits of simple parameterization and the ability to correct the mild to moderate distortion expected of production optics. The correction process starts with imaging a dual band target. A feature extraction algorithm is applied to the imagery from both bands to generate a large number of correlated feature points. Using the feature points, we derive an over-determined system of equations; the solution to this system yields the distortion parameters for the lens. Using these parameters, an interpolation map can be generated that is unique to the lenses involved. The interpolation map is used in real-time to correct the distortion while preserving the horizontal field of view constraint on the system.
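The core fitting step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the common first-order radial model r_u = r_d(1 + k r_d²) about a known distortion center, and fits the single parameter k by linear least squares from matched feature points (the distorted positions observed on the target versus their ideal, undistorted positions). The function names and the choice of image center as distortion center are assumptions for the sketch; the paper's scale-preserving, FOV-constrained variant and the real-time interpolation map are not reproduced here.

```python
import numpy as np

def fit_first_order_distortion(distorted_xy, ideal_xy, center):
    """Least-squares fit of k in the radial model r_u = r_d * (1 + k * r_d**2).

    distorted_xy, ideal_xy : (N, 2) arrays of matched feature points
    center                 : (cx, cy) assumed distortion center

    Each point contributes one equation, r_u - r_d = k * r_d**3,
    giving an over-determined linear system in the single unknown k.
    """
    d = distorted_xy - center
    u = ideal_xy - center
    r_d = np.hypot(d[:, 0], d[:, 1])
    r_u = np.hypot(u[:, 0], u[:, 1])
    a = r_d ** 3          # design "matrix" (one column)
    b = r_u - r_d         # observed radial displacement
    return float(a @ b / (a @ a))   # closed-form 1-D least squares

def undistort_points(xy, center, k):
    """Apply the fitted radial correction to distorted points."""
    d = xy - center
    r2 = np.sum(d ** 2, axis=1, keepdims=True)
    return center + d * (1.0 + k * r2)
```

In a production pipeline, `undistort_points` would be evaluated once per output pixel to build the interpolation (remap) table, so the per-frame cost is just a lookup and interpolation rather than a model evaluation.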

Published in:

2010 IEEE 39th Applied Imagery Pattern Recognition Workshop (AIPR)

Date of Conference:

13-15 Oct. 2010