Feature-driven exemplar-based nonlocal image inpainting

[Figure: Result of the proposed inpainting algorithm]


We present an image inpainting model that can draw on a variety of image features. The model propagates structure into the inpainting region while exploiting self-similarity in the intact portions of the image to recover texture.

Significance and Impact

Compressive sensing, super-resolution, inpainting, and denoising are among the most common inverse problems in image processing. Inpainting is a process by which lost, deteriorated, or undesirable portions of images and videos are replaced with a visually plausible interpolation. The applications are plentiful. For example, inpainting is a core component of the analysis of experimental data, which is often corrupted by noise and other measurement artifacts. Post-processing with automatic artifact detection and removal is a necessary step for extracting useful information from such corrupted datasets.

Research Details

  • proposed a novel nonlocal variational model and an efficient algorithm for image inpainting
  • created easy-to-use software implementing the algorithm


We present a nonlocal variational image completion technique that admits simultaneous inpainting of multiple structures and textures in a unified framework. The recovery of geometric structures is achieved by using general convolution operators as descriptors of local image behavior. These are combined with a nonlocal exemplar-based approach that exploits the self-similarity of an image in the selected feature domains and ensures the inpainting of textures. We also introduce an anisotropic patch distance metric to allow finer control of feature selection within an image and present a nonlocal energy functional based on this metric. Finally, we derive an optimization algorithm for the proposed variational model and validate it experimentally on a variety of test images.
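The nonlocal exemplar-based idea can be illustrated with a minimal sketch. This is not the authors' algorithm: it replaces the variational optimization with a simple greedy patch-matching fill, and the `weights` array is only a stand-in for the anisotropic patch distance metric described above. All names and parameters here are illustrative.

```python
import numpy as np

def inpaint_exemplar(img, mask, r=2, weights=None):
    """Greedy exemplar-based inpainting (illustrative sketch).

    img     : 2D float array.
    mask    : boolean array, True where pixels are missing.
    r       : patch radius (patches are (2r+1) x (2r+1)).
    weights : optional per-pixel patch weights -- a crude stand-in for
              an anisotropic patch distance metric.
    Pixels closer than r to the image border are not filled here.
    """
    out = img.astype(float).copy()
    todo = mask.copy()
    H, W = out.shape
    if weights is None:
        weights = np.ones((2 * r + 1, 2 * r + 1))

    # Candidate source patches: fully intact and away from the border.
    known = ~mask
    sources = [(i, j)
               for i in range(r, H - r) for j in range(r, W - r)
               if known[i - r:i + r + 1, j - r:j + r + 1].all()]

    # Fill from the hole boundary inward: a missing pixel is filled once
    # its patch contains at least one already-known pixel.
    while todo.any():
        progress = False
        for i in range(r, H - r):
            for j in range(r, W - r):
                if not todo[i, j]:
                    continue
                valid = ~todo[i - r:i + r + 1, j - r:j + r + 1]
                if not valid.any():
                    continue  # no known context yet; revisit later
                tgt = out[i - r:i + r + 1, j - r:j + r + 1]
                best, best_d = None, np.inf
                for si, sj in sources:
                    src = out[si - r:si + r + 1, sj - r:sj + r + 1]
                    # Weighted SSD over the known part of the target patch.
                    d = np.sum(weights * valid * (tgt - src) ** 2)
                    if d < best_d:
                        best_d, best = d, (si, sj)
                if best is not None:
                    out[i, j] = out[best]   # copy the exemplar's center
                    todo[i, j] = False
                    progress = True
        if not progress:
            break  # nothing fillable (e.g. no intact source patches)
    return out
```

On a self-similar texture such as vertical stripes, the weighted patch distance locates an exemplar elsewhere in the image whose known context matches the hole's surroundings, so the missing pixels are reconstructed from that exemplar; this is the self-similarity property the nonlocal term exploits.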





Last Updated: January 14, 2021 - 11:14 am