Object cut as minimum ratio cycle in a superpixel boundary graph

Gao Zhu, Yansheng Ming, Hongdong Li

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    1 Citation (Scopus)

    Abstract

    A category-specific object cut method is proposed in this paper that combines minimum ratio cycle optimization with superpixel segmentation. The method finds a non-self-intersecting cycle in the image plane that aligns well with the outer boundary of an object instance. Most existing approaches under the minimum ratio cycle optimization framework target unsupervised image segmentation; applying them directly introduces an orientation ambiguity that makes the globally minimal solution unattainable. It is demonstrated that a modification based on top-down classification information alleviates this difficulty, even though the same modification does not hold for traditional linear-energy object cut methods. Experiments on the PASCAL VOC 2007 segmentation dataset show improved performance when our method is compared with other competitive object cut algorithms.
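    The minimum ratio cycle framework the abstract refers to seeks a cycle minimizing the ratio of two accumulated edge quantities. A standard way to solve this (a sketch of Lawler's parametric search, not the paper's specific formulation) is to binary-search the ratio λ and test whether the graph has a negative cycle under reweighted costs w(e) − λ·t(e), using Bellman-Ford for the test. The graph and weights below are hypothetical toy values, not the superpixel boundary graph of the paper.

    ```python
    def has_negative_cycle(n, edges):
        """Bellman-Ford negative-cycle test.
        edges: list of (u, v, cost) with 0 <= u, v < n.
        Initializing all distances to 0 acts like a virtual source connected
        to every node, so any negative cycle in the graph is detected."""
        dist = [0.0] * n
        for _ in range(n):
            updated = False
            for u, v, c in edges:
                if dist[u] + c < dist[v] - 1e-12:
                    dist[v] = dist[u] + c
                    updated = True
            if not updated:
                return False
        # Still relaxing after n full passes => a negative cycle exists.
        return True

    def min_ratio_cycle(n, edges, iters=60):
        """Approximate the minimum over cycles C of sum_C(w) / sum_C(t),
        with all t > 0. edges: list of (u, v, w, t)."""
        lo, hi = 0.0, max(w / t for _, _, w, t in edges)
        for _ in range(iters):
            lam = (lo + hi) / 2.0
            reweighted = [(u, v, w - lam * t) for u, v, w, t in edges]
            if has_negative_cycle(n, reweighted):
                hi = lam   # some cycle achieves ratio < lam
            else:
                lo = lam   # every cycle has ratio >= lam
        return hi

    # Toy graph: cycle 0->1->0 has ratio (1+1)/(1+1) = 1.0,
    # cycle 0->2->0 has ratio (1+5)/(2+2) = 1.5, so the optimum is 1.0.
    edges = [(0, 1, 1.0, 1.0), (1, 0, 1.0, 1.0),
             (0, 2, 1.0, 2.0), (2, 0, 5.0, 2.0)]
    print(round(min_ratio_cycle(3, edges), 6))
    ```

    The ratio objective is what makes the cycle length-invariant (a long, low-contrast boundary can still beat a short spurious one), which is why the framework suits closed object boundaries; the orientation ambiguity discussed in the paper arises from how boundary edges are directed in such a graph.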

    Original language: English
    Title of host publication: 2013 International Conference on Digital Image Computing
    Subtitle of host publication: Techniques and Applications, DICTA 2013
    DOIs
    Publication status: Published - 2013
    Event: 2013 International Conference on Digital Image Computing: Techniques and Applications, DICTA 2013 - Hobart, TAS, Australia
    Duration: 26 Nov 2013 - 28 Nov 2013

    Publication series

    Name: 2013 International Conference on Digital Image Computing: Techniques and Applications, DICTA 2013

    Conference

    Conference: 2013 International Conference on Digital Image Computing: Techniques and Applications, DICTA 2013
    Country/Territory: Australia
    City: Hobart, TAS
    Period: 26/11/13 - 28/11/13
