Exploiting sparsity for real time video labelling

Lachlan Horne, Jose M. Alvarez, Nick Barnes

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    3 Citations (Scopus)

    Abstract

    Until recently, inference on fully connected graphs of pixel labels for scene understanding has been computationally expensive, so fast methods have focussed on neighbour connections and unary computation. However, with efficient CRF methods for inference on fully connected graphs, the opportunity exists for exploring other approaches. In this paper, we present a fast approach that calculates unary labels sparsely and relies on inference on fully connected graphs for label propagation. This reduces the unary computation, which is now the most computationally expensive component. On a standard road scene dataset (CamVid), we show that accuracy remains high when less than 0.15 percent of unary potentials are used. This achieves a reduction in computation by a factor of more than 750, with only a small loss in global accuracy. This facilitates real-time processing on standard hardware that produces almost state-of-the-art results.
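
    The core idea described in the abstract (compute classifier unaries at only a sparse subset of pixels, then let inference on a fully connected CRF propagate labels to the remaining pixels) can be illustrated with the pydensecrf wrapper of the Krähenbühl-Koltun dense CRF. The sketch below is not the authors' implementation: the function name, the use of uniform (uninformative) unaries at unsampled pixels, and the pairwise kernel parameters are illustrative assumptions.

    ```python
    # Illustrative sketch only: sparse unaries + fully connected CRF propagation.
    # Assumes the pydensecrf package (wrapper of the Krahenbuhl-Koltun dense CRF).
    import numpy as np
    import pydensecrf.densecrf as dcrf

    def sparse_unary_labelling(img, classifier_probs, sample_mask, n_labels, n_iters=5):
        """img: H x W x 3 uint8 image.
        classifier_probs: n_labels x H x W class probabilities (trusted only where sampled).
        sample_mask: H x W bool, True at the sparse pixels where unaries were computed.
        Returns an H x W label map."""
        H, W = sample_mask.shape

        # Start from a uniform (uninformative) distribution everywhere, then overwrite
        # the sparsely sampled pixels with the classifier output. The dense pairwise
        # terms propagate these sparse unaries to the unsampled pixels.
        probs = np.full((n_labels, H, W), 1.0 / n_labels, dtype=np.float32)
        probs[:, sample_mask] = classifier_probs[:, sample_mask]

        unary = -np.log(probs.reshape(n_labels, -1))  # negative log-probabilities

        d = dcrf.DenseCRF2D(W, H, n_labels)
        d.setUnaryEnergy(np.ascontiguousarray(unary, dtype=np.float32))
        d.addPairwiseGaussian(sxy=3, compat=3)                        # smoothness kernel
        d.addPairwiseBilateral(sxy=80, srgb=13, rgbim=img, compat=10)  # appearance kernel
        Q = d.inference(n_iters)
        return np.argmax(Q, axis=0).reshape(H, W)
    ```

    The per-frame cost of the unary classifier then scales with the number of sampled pixels rather than with the full frame, which is the source of the reported speed-up; the kernel widths and compatibility weights above are generic defaults, not values taken from the paper.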

    Original language: English
    Title of host publication: Proceedings - 2013 IEEE International Conference on Computer Vision Workshops, ICCVW 2013
    Publisher: Institute of Electrical and Electronics Engineers Inc.
    Pages: 632-637
    Number of pages: 6
    ISBN (Print): 9781479930227
    DOIs
    Publication status: Published - 2013
    Event: 2013 14th IEEE International Conference on Computer Vision Workshops, ICCVW 2013 - Sydney, NSW, Australia
    Duration: 1 Dec 2013 - 8 Dec 2013

    Publication series

    Name: Proceedings of the IEEE International Conference on Computer Vision

    Conference

    Conference: 2013 14th IEEE International Conference on Computer Vision Workshops, ICCVW 2013
    Country/Territory: Australia
    City: Sydney, NSW
    Period: 1/12/13 - 8/12/13
