
Turbid Scene Enhancement Using Multi-Directional Illumination Fusion

Authors: Treibitz, T. (Dept. of Computer Science & Engineering, University of California, San Diego, CA, USA); Schechner, Y.Y.

Ambient light is strongly attenuated in turbid media. Moreover, natural light is often more highly attenuated in some spectral bands than in others. Hence, imaging in turbid media often relies heavily on artificial sources for illumination. Scenes irradiated by an off-axis single point source have enhanced local object shadow edges, which may increase object visibility. However, the images may suffer from severe nonuniformity, regions of low signal (being distant from the source), and regions of strong backscatter. On the other hand, simultaneously illuminating the scene from multiple directions increases the backscatter and fills in shadows, both of which degrade local contrast. Some previous methods tackle backscatter by scanning the scene, either temporally or spatially, requiring a large number of frames. We suggest using a few frames, in each of which wide-field scene irradiance originates from a different direction. This way, shadow contrast can be maintained and backscatter can be minimized in each frame, while the sequence at large has a wider, more spatially uniform illumination. The frames are then fused by post-processing into a single, clearer image. We demonstrate significant visibility enhancement underwater using as few as two frames.
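To make the idea of fusing differently illuminated frames concrete, below is a minimal sketch of a generic per-pixel fusion that weights each frame by local contrast, so that well-lit, shadow-rich regions dominate the result. This is not the fusion algorithm described in the paper, only an illustration under assumed inputs; the function name fuse_frames, the window size, and the contrast-based weighting are all hypothetical choices.

    # Hypothetical sketch: fuse frames lit from different directions by
    # weighting each pixel with the local contrast of its frame.
    # This is NOT the paper's method, only a generic illustration.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def fuse_frames(frames, window=15, eps=1e-6):
        """Fuse grayscale frames (list of 2-D float arrays in [0, 1])."""
        weights = []
        for f in frames:
            mean = uniform_filter(f, size=window)
            var = uniform_filter(f * f, size=window) - mean * mean
            # Local standard deviation as a proxy for local contrast.
            weights.append(np.sqrt(np.clip(var, 0.0, None)) + eps)
        weights = np.stack(weights)
        weights /= weights.sum(axis=0, keepdims=True)  # normalize per pixel
        return (weights * np.stack(frames)).sum(axis=0)

    # Example with synthetic frames: one lit from the left, one from the right.
    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        scene = rng.random((128, 128))
        ramp = np.linspace(1.0, 0.2, 128)
        left = scene * ramp[None, :]       # brighter toward the left
        right = scene * ramp[None, ::-1]   # brighter toward the right
        fused = fuse_frames([left, right])
        print(fused.shape, fused.min(), fused.max())

In this toy setup, regions that are dim (and low-contrast) under one illumination direction draw most of their value from the frame in which they are well lit, which is the intuition behind combining a few directionally illuminated frames into one more uniformly illuminated image.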

Published in:

IEEE Transactions on Image Processing, Volume 21, Issue 11