An approximate sparsity model for inpainting

Lixin Shen, Yuesheng Xu, Na Zhang

Research output: Contribution to journal › Letter/Newsletter › peer-review

3 Scopus citations


Existing sparse inpainting models often suffer from over-constraining the sparsity of the transformed recovered images. Because the transformed image under a wavelet or framelet transform is not truly sparse but only approximately sparse, we introduce an approximate sparsity model for inpainting. We formulate the model as minimizing the number of nonzero components of the soft-thresholding operator applied to the transformed image. The key difference between the proposed model and existing ones is the use of a soft-thresholding operator, which shrinks the components of the transformed image. To efficiently solve the resulting nonconvex optimization problem, we rewrite the ℓ0 norm, which counts the number of nonzero components, as a weighted ℓ1 norm with a nonlinear discontinuous weight function, which is then approximated by a continuous weight function. We overcome the nonlinearity in the weight function by an iteration that leads to a numerical scheme for solving the nonconvex optimization problem. In each iteration, we solve a weighted ℓ1 convex optimization problem. We then focus on establishing the existence of solutions of the weighted ℓ1 convex optimization problem and characterizing them as fixed-points of a nonlinear mapping. The fixed-point formulation allows us to employ efficient iterative algorithms to find the fixed-points. Numerical experiments demonstrate the improved performance of the proposed model over existing models for image inpainting.
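The two central ingredients of the abstract — the componentwise soft-thresholding operator and the approximate-sparsity count it induces — can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation; the threshold value and the continuous surrogate weight (a standard reweighted-ℓ1 style choice) are illustrative assumptions and may differ from the weight function used in the paper.

```python
import numpy as np

def soft_threshold(x, lam):
    # Componentwise soft-thresholding: shrinks each entry toward zero by lam,
    # setting entries with |x_i| <= lam exactly to zero.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def approx_sparsity(x, lam):
    # The quantity the model minimizes: the number of components that
    # survive soft-thresholding, i.e. the l0 "norm" of soft_threshold(x, lam).
    return int(np.count_nonzero(soft_threshold(x, lam)))

def surrogate_weight(t, eps=1e-3):
    # A continuous weight of the standard reweighted-l1 type (illustrative
    # only): w(t)|t| is close to 1 for large |t| and 0 at t = 0, so the
    # weighted l1 norm sum(w(x_i)|x_i|) mimics the l0 count.
    return 1.0 / (np.abs(t) + eps)

# An approximately sparse vector: two significant entries plus small noise,
# as produced by a wavelet or framelet transform of a natural image.
x = np.array([3.0, 0.0, -2.5, 0.01, -0.02, 0.0])
print(approx_sparsity(x, 0.1))  # the noise entries fall below the threshold
```

With the threshold 0.1, only the two significant coefficients are counted, whereas a plain nonzero count on `x` would also include the noise entries — this is the sense in which the soft-thresholded count relaxes an over-constrained exact-sparsity model.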

Original language: English (US)
Pages (from-to): 171-184
Number of pages: 14
Journal: Applied and Computational Harmonic Analysis
Issue number: 1
State: Published - Jul 2014


Keywords

  • Inpainting
  • Sparsity
  • Tight framelet

ASJC Scopus subject areas

  • Applied Mathematics


