Class-specific image deblurring

Saeed Anwar, Cong Phuoc Huynh, Fatih Porikli

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    26 Citations (Scopus)

    Abstract

    In image deblurring, a fundamental problem is that the blur kernel suppresses a number of spatial frequencies that are difficult to recover reliably. In this paper, we explore the potential of a class-specific image prior for recovering spatial frequencies attenuated by the blurring process. Specifically, we devise a prior based on the class-specific subspace of image intensity responses to band-pass filters. We show that the aggregation of these subspaces across all frequency bands serves as an effective class-specific prior for restoring frequencies that cannot be recovered with generic image priors. In an extensive validation, our method, equipped with this prior, achieves higher image quality than many state-of-the-art methods, by up to 5 dB in PSNR, across image categories including portraits, cars, cats, pedestrians and household objects.
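    The prior described in the abstract lends itself to a small illustrative sketch: learn, for each frequency band, a low-dimensional subspace of band-pass responses from sharp exemplars of one class, then score a candidate restoration by how well its band responses fit those subspaces. This is not the authors' implementation; the difference-of-Gaussians band-pass filters, the patch size, and the PCA subspace dimension below are assumptions made purely for illustration.

```python
# Illustrative sketch of a class-specific, per-band subspace prior.
# Not the method from the paper; filter, patch and dimension choices are assumed.
import numpy as np
from scipy.ndimage import gaussian_filter

def bandpass_responses(image, sigmas=(1.0, 2.0, 4.0, 8.0)):
    """Difference-of-Gaussians responses, one per frequency band."""
    blurred = [gaussian_filter(image, s) for s in sigmas]
    # Band k captures structure between scales sigmas[k] and sigmas[k+1].
    return [blurred[k] - blurred[k + 1] for k in range(len(sigmas) - 1)]

def extract_patches(response, patch=8, stride=8):
    """Flatten non-overlapping patches of a single band response."""
    h, w = response.shape
    rows = []
    for y in range(0, h - patch + 1, stride):
        for x in range(0, w - patch + 1, stride):
            rows.append(response[y:y + patch, x:x + patch].ravel())
    return np.asarray(rows)

def learn_band_subspace(training_images, band_index, dim=16):
    """PCA subspace of patch responses in one band, over class exemplars."""
    data = np.vstack([
        extract_patches(bandpass_responses(img)[band_index])
        for img in training_images
    ])
    mean = data.mean(axis=0)
    # Principal directions of the centred responses span the class subspace.
    _, _, vt = np.linalg.svd(data - mean, full_matrices=False)
    return mean, vt[:dim]

def subspace_prior_cost(image, band_models):
    """Sum of squared residuals after projecting each band onto its subspace."""
    cost = 0.0
    for band_index, (mean, basis) in enumerate(band_models):
        patches = extract_patches(bandpass_responses(image)[band_index])
        centred = patches - mean
        residual = centred - (centred @ basis.T) @ basis
        cost += np.sum(residual ** 2)
    return cost

# Usage: learn per-band subspaces from sharp class exemplars (stand-in random
# data here), then evaluate the prior cost of a candidate restoration.
exemplars = [np.random.rand(64, 64) for _ in range(20)]
models = [learn_band_subspace(exemplars, b) for b in range(3)]
print(subspace_prior_cost(np.random.rand(64, 64), models))
```

    In a deblurring pipeline, a term like subspace_prior_cost would serve as the class-specific regulariser added to the usual data-fidelity term during deconvolution.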

    Original language: English
    Title of host publication: 2015 International Conference on Computer Vision, ICCV 2015
    Publisher: Institute of Electrical and Electronics Engineers Inc.
    Pages: 495-503
    Number of pages: 9
    ISBN (Electronic): 9781467383912
    DOIs
    Publication status: Published - 17 Feb 2015
    Event: 15th IEEE International Conference on Computer Vision, ICCV 2015 - Santiago, Chile
    Duration: 11 Dec 2015 - 18 Dec 2015

    Publication series

    Name: Proceedings of the IEEE International Conference on Computer Vision
    Volume: 2015 International Conference on Computer Vision, ICCV 2015
    ISSN (Print): 1550-5499

    Conference

    Conference: 15th IEEE International Conference on Computer Vision, ICCV 2015
    Country/Territory: Chile
    City: Santiago
    Period: 11/12/15 - 18/12/15
