PatchMatchGraph: Building a graph of dense patch correspondences for label transfer

Stephen Gould*, Yuhang Zhang

*Corresponding author for this work

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    37 Citations (Scopus)

    Abstract

    We address the problem of semantic segmentation, or multi-class pixel labeling, by constructing a graph of dense overlapping patch correspondences across large image sets. We then transfer annotations from labeled images to unlabeled images using the established patch correspondences. Unlike previous approaches to non-parametric label transfer, our approach does not require an initial image retrieval step. Moreover, we operate on a graph for computing mappings between images, which avoids the need for exhaustive pairwise comparisons. Consequently, we can leverage offline computation to enhance performance at test time. We conduct extensive experiments to analyze different variants of our graph construction algorithm and evaluate multi-class pixel labeling performance on several challenging datasets.
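    The abstract describes two steps: building a graph of dense patch correspondences offline, then transferring labels across those correspondences at test time. Below is a minimal sketch of the second step, label transfer by per-pixel voting; because patches overlap, each target pixel receives votes from several correspondences. The data layout (a correspondence list and a dict of source label maps) and all names are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of label transfer via patch
# correspondences. Assumes a precomputed correspondence list mapping
# each patch in the unlabeled target image to a patch in some labeled
# source image; names and data layout are hypothetical.

import numpy as np

def transfer_labels(correspondences, source_labels, target_shape,
                    patch_size=8, num_classes=21):
    """Vote per-pixel class labels in the target image.

    correspondences : list of ((ty, tx), (src_id, sy, sx)) tuples, one
        per target patch, giving the matched patch in a source image.
    source_labels   : dict mapping src_id -> (H, W) integer label map.
    target_shape    : (H, W) of the unlabeled target image.
    """
    votes = np.zeros(target_shape + (num_classes,), dtype=np.float64)
    for (ty, tx), (src_id, sy, sx) in correspondences:
        src = source_labels[src_id]
        patch = src[sy:sy + patch_size, sx:sx + patch_size]
        # Each pixel of the matched source patch casts one vote for its
        # class at the corresponding pixel of the target image.
        for dy in range(patch.shape[0]):
            for dx in range(patch.shape[1]):
                y, x = ty + dy, tx + dx
                if y < target_shape[0] and x < target_shape[1]:
                    votes[y, x, patch[dy, dx]] += 1.0
    # Final per-pixel label is the class with the most accumulated votes.
    return votes.argmax(axis=2)
```

    In practice the paper weights and smooths such evidence rather than taking a raw majority vote, but the voting structure conveys why overlapping patches and a many-image graph remove the need for an image retrieval step.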

    Original language: English
    Title of host publication: Computer Vision, ECCV 2012 - 12th European Conference on Computer Vision, Proceedings
    Pages: 439-452
    Number of pages: 14
    Edition: PART 5
    DOIs
    Publication status: Published - 2012
    Event: 12th European Conference on Computer Vision, ECCV 2012 - Florence, Italy
    Duration: 7 Oct 2012 - 13 Oct 2012

    Publication series

    Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
    Number: PART 5
    Volume: 7576 LNCS
    ISSN (Print): 0302-9743
    ISSN (Electronic): 1611-3349

    Conference

    Conference: 12th European Conference on Computer Vision, ECCV 2012
    Country/Territory: Italy
    City: Florence
    Period: 7/10/12 - 13/10/12
