Robust ImageGraph: Rank-level feature fusion for image search

Ziqiong Liu, Shengjin Wang*, Liang Zheng, Qi Tian

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

36 Citations (Scopus)

Abstract

Recently, feature fusion has demonstrated its effectiveness in image search. However, poor features and inappropriate parameters often introduce false positive images, i.e., outliers, which degrade performance. A major challenge for any fusion scheme is therefore robustness to outliers. Towards this goal, this paper proposes a rank-level framework for robust feature fusion. First, we define Rank Distance to measure the relevance of images at the rank level. Based on this distance, Bayes similarity is introduced to evaluate the retrieval quality of individual features, so that true matches tend to receive higher weights than outliers. Then, we construct a directed ImageGraph to encode the relationships among images: each image is connected to its K nearest neighbors by edges weighted with Bayes similarity. Multiple rank lists resulting from different methods are merged via the ImageGraph. Furthermore, local ranking is performed on the fused ImageGraph to re-order the initial rank lists; because it targets local optimization, it is more robust to global outliers. Extensive experiments on four benchmark data sets validate the effectiveness of our method. Moreover, the proposed method outperforms two popular fusion schemes, and the results are competitive with the state of the art.
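To make the fusion pipeline in the abstract concrete, the following is a minimal Python sketch of the rank-level idea: build a directed K-nearest-neighbor graph from each feature's rank list, merge the graphs by accumulating edge weights, and re-order candidates by their fused weights. The simple 1/(rank+1) edge weight is a placeholder assumption, not the paper's Bayes similarity (which is derived from Rank Distance and is not specified in the abstract), and all function and variable names are illustrative.

```python
from collections import defaultdict


def build_imagegraph(rank_lists, k=5):
    """Build a directed graph: each query image points to its top-k
    retrieved images. The edge weight here is a simple rank-based proxy
    (1 / (rank + 1)); the paper instead weights edges with a Bayes
    similarity derived from its Rank Distance."""
    graph = defaultdict(dict)
    for query, ranked in rank_lists.items():
        for rank, neighbor in enumerate(ranked[:k]):
            graph[query][neighbor] = 1.0 / (rank + 1)
    return graph


def fuse_graphs(graphs):
    """Merge per-feature graphs by accumulating edge weights, so images
    retrieved consistently by several features gain larger weights than
    outliers returned by only one feature."""
    fused = defaultdict(lambda: defaultdict(float))
    for graph in graphs:
        for query, edges in graph.items():
            for neighbor, weight in edges.items():
                fused[query][neighbor] += weight
    return fused


def rerank(fused, query):
    """Re-order candidates for one query by their fused edge weights,
    a stand-in for the paper's local ranking on the fused ImageGraph."""
    edges = fused.get(query, {})
    return sorted(edges, key=edges.get, reverse=True)


if __name__ == "__main__":
    # Two hypothetical rank lists for query "q" from different features.
    rank_lists_a = {"q": ["img3", "img7", "img9", "img2"]}
    rank_lists_b = {"q": ["img7", "img3", "img5", "img9"]}
    fused = fuse_graphs([build_imagegraph(rank_lists_a, k=3),
                         build_imagegraph(rank_lists_b, k=3)])
    print(rerank(fused, "q"))  # images agreed on by both features rank first
```

In this toy example, images returned near the top by both features (img3, img7) accumulate the largest fused weights and rank ahead of images suggested by only one feature, which mirrors how the fused graph is intended to suppress outliers.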

Original language: English
Article number: 7835116
Pages (from-to): 3128-3141
Number of pages: 14
Journal: IEEE Transactions on Image Processing
Volume: 26
Issue number: 7
DOIs
Publication status: Published - Jul 2017
Externally published: Yes

