Defocus can be modeled as a diffusion process and represented mathematically via the heat equation, where image blur corresponds to the diffusion of heat. The analogy extends to nonplanar scenes by allowing a space-varying diffusion coefficient. The inverse problem of reconstructing 3D structure from blurred images then corresponds to an "inverse diffusion" that is notoriously ill posed. We show how to bypass this problem by using the notion of relative blur: given two images, the amount of diffusion needed within each neighborhood to transform the sharper image into the blurrier one depends on the depth of the scene. This yields a global algorithm that estimates the depth profile of the scene using only forward diffusion, without ever recovering the deblurred image.
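The relative-blur idea can be illustrated with a minimal sketch. The code below is not the paper's algorithm: it assumes a constant (not space-varying) diffusion coefficient, periodic image boundaries, and an explicit Euler discretization of the heat equation; all function names are illustrative. It forward-diffuses the sharper image step by step and reports the number of steps that best matches the blurrier one, a proxy for relative blur and hence depth.

```python
import numpy as np

def diffuse_step(u, c=1.0, dt=0.2):
    """One explicit Euler step of the heat equation u_t = c * laplacian(u),
    with a 5-point Laplacian and periodic boundaries (np.roll).
    Stable for c * dt <= 0.25."""
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0)
           + np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
    return u + dt * c * lap

def diffuse(img, steps, c=1.0, dt=0.2):
    """Forward diffusion for a given number of steps (models defocus blur)."""
    u = img.astype(float).copy()
    for _ in range(steps):
        u = diffuse_step(u, c, dt)
    return u

def relative_blur_steps(sharp, blurry, c=1.0, dt=0.2, max_steps=50):
    """Estimate how much forward diffusion transforms the sharper image
    into the blurrier one, by minimizing the mean squared difference
    over diffusion time. Only forward diffusion is ever applied."""
    u = sharp.astype(float).copy()
    best_err, best_t = np.inf, 0
    for t in range(max_steps + 1):
        err = np.mean((u - blurry) ** 2)
        if err < best_err:
            best_err, best_t = err, t
        u = diffuse_step(u, c, dt)
    return best_t

# Demo: blur a random "sharp" image by 10 diffusion steps,
# then recover that amount from the image pair.
rng = np.random.default_rng(0)
sharp = rng.random((32, 32))
blurry = diffuse(sharp, steps=10)
print(relative_blur_steps(sharp, blurry))  # → 10
```

In this toy setting the diffusion time is recovered exactly because the forward model is known; the paper's contribution is doing this globally, per neighborhood, with a space-varying coefficient, so that the recovered diffusion amounts trace out a depth map.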