Superpixel graph label transfer with learned distance metric

Stephen Gould, Jiecheng Zhao, Xuming He, Yuhang Zhang

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    36 Citations (Scopus)

    Abstract

    We present a fast approximate nearest neighbor algorithm for semantic segmentation. Our algorithm builds a graph over superpixels from an annotated set of training images. Edges in the graph represent approximate nearest neighbors in feature space. At test time we match superpixels from a novel image to the training images by adding the novel image to the graph. A move-making search algorithm allows us to leverage the graph and image structure for finding matches. We then transfer labels from the training images to the image under test. To promote good matches between superpixels we propose to learn a distance metric that weights the edges in our graph. Our approach is evaluated on four standard semantic segmentation datasets and achieves results comparable with the state-of-the-art.
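    The abstract describes a nearest-neighbour label-transfer scheme over superpixels with a learned distance metric. Below is a minimal illustrative sketch, not the authors' implementation: the superpixel graph and move-making search are simplified to a brute-force nearest-neighbour lookup under an assumed diagonal (per-dimension weighted) metric, and all names, features, and the metric weights are hypothetical placeholders.

    ```python
    # Sketch of label transfer between superpixels under a learned
    # (here, diagonal Mahalanobis) distance. Illustrative only: the
    # paper's superpixel graph and move-making search are replaced by
    # a brute-force nearest-neighbour lookup.
    import numpy as np

    def weighted_sq_dist(x, Y, w):
        """Squared distance from feature vector x to each row of Y,
        with learned per-dimension weights w."""
        d = Y - x
        return (d * d * w).sum(axis=1)

    def transfer_labels(test_feats, train_feats, train_labels, w):
        """Give each test superpixel the label of its nearest training
        superpixel under the weighted distance."""
        labels = np.empty(len(test_feats), dtype=train_labels.dtype)
        for i, x in enumerate(test_feats):
            j = np.argmin(weighted_sq_dist(x, train_feats, w))
            labels[i] = train_labels[j]
        return labels

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        train_feats = rng.normal(size=(500, 16))     # training superpixel features (placeholder)
        train_labels = rng.integers(0, 4, size=500)  # semantic class per training superpixel
        test_feats = rng.normal(size=(50, 16))       # superpixels from a novel image
        w = np.ones(16)                              # learned metric weights (placeholder)
        print(transfer_labels(test_feats, train_feats, train_labels, w))
    ```

    In the paper, the matching is done by adding the novel image's superpixels to the graph and running a move-making search over it rather than the exhaustive loop above, and the weights w are learned so that matched superpixels tend to share labels.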

    Original language: English
    Title of host publication: Computer Vision, ECCV 2014 - 13th European Conference, Proceedings
    Publisher: Springer Verlag
    Pages: 632-647
    Number of pages: 16
    Edition: PART 1
    ISBN (Print): 9783319105895
    DOIs
    Publication status: Published - 2014
    Event: 13th European Conference on Computer Vision, ECCV 2014 - Zurich, Switzerland
    Duration: 6 Sept 2014 – 12 Sept 2014

    Publication series

    Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
    Number: PART 1
    Volume: 8689 LNCS
    ISSN (Print): 0302-9743
    ISSN (Electronic): 1611-3349

    Conference

    Conference: 13th European Conference on Computer Vision, ECCV 2014
    Country/Territory: Switzerland
    City: Zurich
    Period: 6/09/14 – 12/09/14
