Abstract
Some structured-light setups create spatio-temporal irradiance variations. Such variations also occur naturally underwater, where they are termed flicker. Underwater, visibility is further degraded by scattering in the water. Existing methods overcome or exploit flicker or scatter only when the imaging geometry is static or quasi-static. This work removes the need for quasi-static scene-object geometry under flickering illumination. A scene is observed from a freely moving platform that carries standard frame-rate stereo cameras. The 3D scene structure is illumination-invariant. Thus, as a reference for motion estimation, we use projections of stereoscopic range maps rather than object radiance. Consequently, each object point can be tracked and then temporally filtered, yielding deflickered videos. Moreover, since objects are viewed from different distances as the stereo rig moves, scattering effects on the images are modulated. This modulation, together with the recovered camera poses, 3D structure and deflickered images, enables inversion of scattering and recovery of the water attenuation coefficient. Thus, coupled difficult problems are solved in a single framework. This is demonstrated in underwater field experiments and in the lab.
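To illustrate the scattering-inversion step mentioned above, the following is a minimal sketch assuming the standard single-scattering underwater image-formation model (object signal attenuated by exp(-c z), plus veiling light). The function names, the veiling-light parameter `A`, and the specific model are illustrative assumptions, not the paper's exact formulation; the per-pixel range `z` would come from the stereoscopic range maps.

```python
import numpy as np

def attenuate(J, z, c, A):
    """Forward model (an assumption, not the paper's exact equations):
    object radiance J at range z decays by exp(-c z), while veiling
    light A fills in with (1 - exp(-c z))."""
    t = np.exp(-c * z)          # transmission along the line of sight
    return J * t + A * (1.0 - t)

def invert_scatter(I, z, c, A):
    """Invert the forward model above to recover object radiance J,
    given the attenuation coefficient c and per-pixel range z."""
    t = np.exp(-c * z)
    return (I - A * (1.0 - t)) / t

# Round-trip check on synthetic data
J = np.array([0.2, 0.5, 0.9])   # true object radiance
z = np.array([1.0, 3.0, 6.0])   # per-pixel range (from stereo maps)
I = attenuate(J, z, c=0.3, A=0.8)
J_hat = invert_scatter(I, z, c=0.3, A=0.8)
```

In the paper's setting, observing the same object point at several distances modulates `t`, which is what makes `c` itself recoverable; the sketch above only shows the inversion once `c` is known.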