Category-Specific Object Image Denoising

Saeed Anwar*, Fatih Porikli, Cong Phuoc Huynh

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    25 Citations (Scopus)

    Abstract

    We present a novel image denoising algorithm that uses an external, category-specific image database. In contrast to existing noisy image restoration algorithms that search for patches either in a generic database or in the noisy image itself, our method first selects clean images similar to the noisy image from a database consisting of images of the same class. Then, within the spatial locality of each noisy patch, it assembles a set of 'support patches' from the selected images. These noise-free support samples resemble the noisy patch and correspond principally to the identical part of the depicted object. In addition, we employ a content-adaptive distribution model for each patch, deriving the parameters of the distribution from the support patches. We formulate the noise removal task as an optimization problem in the transform domain. Our objective function is composed of a Gaussian fidelity term that imposes category-specific information, and a low-rank term that encourages the similarity between the noisy and the support patches in a robust manner. The denoising process is driven by an iterative selection of support patches and optimization of the objective function. Our extensive experiments on five different object categories confirm the benefit of incorporating category-specific information into noise removal and demonstrate the superior performance of our method over state-of-the-art alternatives.
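
    To make the iterative scheme the abstract describes concrete, below is a minimal NumPy sketch that alternates support-patch selection with a step trading off a Gaussian fidelity term against a low-rank term. Everything in it is an assumption for illustration: the function names (svt, find_support_patches, denoise_patch), the plain nearest-neighbor support search (the paper restricts the search to the spatial locality of each patch), the singular-value-thresholding surrogate for the low-rank term, the closed-form blend weight w = 1/(1 + λσ²) (the minimizer of (1/2σ²)‖x − y‖² + (λ/2)‖x − z‖² for noisy observation y and low-rank estimate z), and all parameter values. It is not the authors' implementation, which operates in a transform domain.

```python
import numpy as np

def svt(matrix, tau):
    """Singular value thresholding: soft-threshold the singular values by tau,
    a standard convex surrogate for low-rank regularization."""
    u, s, vt = np.linalg.svd(matrix, full_matrices=False)
    return (u * np.maximum(s - tau, 0.0)) @ vt

def find_support_patches(patch, clean_patches, k=8):
    """Pick the k clean database patches closest in L2 distance to the current
    estimate (a stand-in for the paper's locality-restricted search)."""
    dists = np.linalg.norm(clean_patches - patch, axis=1)
    return clean_patches[np.argsort(dists)[:k]]

def denoise_patch(noisy_patch, clean_patches, sigma, n_iters=3, k=8, lam=1.0):
    """Alternate support-patch selection with a step balancing a Gaussian
    fidelity term against a low-rank term over the patch stack."""
    estimate = noisy_patch.copy()
    for _ in range(n_iters):
        support = find_support_patches(estimate, clean_patches, k)
        # Stack the current estimate with its support patches; if they depict
        # the same object part, the stack should be approximately low rank.
        stack = np.vstack([estimate[None, :], support])
        low_rank = svt(stack, tau=sigma)
        # Gaussian fidelity: closed-form blend of the noisy observation with
        # the low-rank estimate; a larger noise sigma shifts trust toward the
        # category-specific support patches.
        w = 1.0 / (1.0 + lam * sigma ** 2)
        estimate = w * noisy_patch + (1.0 - w) * low_rank[0]
    return estimate

# Toy usage: 8x8 patches flattened to 64-vectors.
rng = np.random.default_rng(0)
clean_db = rng.standard_normal((500, 64))          # stand-in category database
noisy = clean_db[0] + 0.3 * rng.standard_normal(64)
denoised = denoise_patch(noisy, clean_db, sigma=0.3)
```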

    Original language: English
    Article number: 7997759
    Pages (from-to): 5506-5518
    Number of pages: 13
    Journal: IEEE Transactions on Image Processing
    Volume: 26
    Issue number: 11
    DOIs
    Publication status: Published - Nov 2017
