Proc. IEEE International Conference on Image Processing, Sep. 11-14, 2011, Brussels, Belgium.
Patch-Based Image Deconvolution via Joint Modeling of Sparse Priors
Chao Jia and Brian L. Evans
Department of Electrical and Computer Engineering, Engineering Science Building, The University of Texas at Austin, Austin, TX 78712 USA
kurtjc@gmail.com - bevans@ece.utexas.edu
Paper - Slides - Software
The software is available as a RAR archive, which can be decompressed with 7-Zip or similar tools.
Abstract
Image deconvolution aims to recover an image that has been degraded
by a linear operation, such as blurring, during image acquisition.
Deconvolution based on maximum a posteriori (MAP) estimation
requires a global prior probability model of the original image.
Conventional methods usually model image priors by uniformly
characterizing the statistical properties of either forward
measurements of the image (such as its gradients) or its representation
coefficients in a frame, thereby neglecting local image statistics.
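For reference, the standard degradation and MAP formulation assumed in this setting can be written as follows; the notation (H for the blur operator, n for additive Gaussian noise with variance sigma^2) is ours and is not taken from the paper:
\[
y = Hx + n, \qquad n \sim \mathcal{N}(0, \sigma^2 I),
\]
\[
\hat{x}_{\mathrm{MAP}} = \arg\max_x \; p(y \mid x)\, p(x)
= \arg\min_x \; \frac{1}{2\sigma^2}\,\|y - Hx\|_2^2 \;-\; \log p(x),
\]
so the estimator indeed requires a global prior p(x) on the original image x.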
In this paper, we adopt local sparse representation in image
deconvolution.
Our contributions include proposing
- a joint model of natural images that combines sparse representation
of image patches with sparse gradient priors (one plausible form is
sketched below), and
- an efficient iterative algorithm to infer the MAP estimate for
image deconvolution under the proposed model.
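To make the joint model concrete, one plausible instantiation of such an objective (illustrative only; the exact formulation in the paper may differ) couples a patch-level sparse-coding term with a sparse gradient penalty:
\[
\hat{x} = \arg\min_{x,\,\{\alpha_i\}} \;
\frac{1}{2\sigma^2}\,\|y - Hx\|_2^2
\;+\; \lambda \sum_i \Big( \|R_i x - D\alpha_i\|_2^2 + \mu\,\|\alpha_i\|_1 \Big)
\;+\; \gamma\,\|\nabla x\|_1,
\]
where R_i extracts the i-th image patch, D is a patch dictionary, and \alpha_i are the sparse codes. Under a model of this form, the iterative MAP inference would typically alternate between sparse coding of the patches with the image fixed and an image update with the codes fixed.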
Experiments indicate that the proposed method recovers the original
image with higher peak signal-to-noise ratio (PSNR) and structural
similarity (SSIM) index than state-of-the-art methods.
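As a side note for reproducing the quality metrics on one's own results, PSNR and SSIM can be computed with scikit-image as in the minimal sketch below; this is not the authors' evaluation code, and the file names are placeholders.

# Minimal PSNR/SSIM computation with scikit-image (illustrative only).
from skimage import io, img_as_float
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

# Placeholder file names; substitute the ground-truth and restored images.
original = img_as_float(io.imread("house_original.png", as_gray=True))
restored = img_as_float(io.imread("house_restored.png", as_gray=True))

# img_as_float scales intensities to [0, 1], hence data_range=1.0.
psnr = peak_signal_noise_ratio(original, restored, data_range=1.0)
ssim = structural_similarity(original, restored, data_range=1.0)
print(f"PSNR = {psnr:.2f} dB, SSIM = {ssim:.4f}")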
Figure 1: Visual comparison of the House image in test setting 1 (panels: original, blurred, L0-ABS, proposed).
COPYRIGHT NOTICE: All the documents on this server
have been submitted by their authors to scholarly journals or conferences
as indicated, for the purpose of non-commercial dissemination of
scientific work.
The manuscripts are put on-line to facilitate this purpose.
These manuscripts are copyrighted by the authors or the journals in which
they were published.
You may copy a manuscript for scholarly, non-commercial purposes, such
as research or instruction, provided that you agree to respect these
copyrights.
Last Updated 09/10/11.