TY - JOUR
T1 - Classification and boosting with multiple collaborative representations
AU - Chi, Yuejie
AU - Porikli, Fatih
PY - 2014/8
Y1 - 2014/8
N2 - Recent advances have shown great potential in exploring collaborative representations, including sparse representations, of a test sample in a dictionary composed of training samples from all classes for multi-class recognition. In this paper, we present two multi-class classification algorithms that make use of multiple collaborative representations in their formulations, and we demonstrate the performance gain from exploiting this extra degree of freedom. We first present the Collaborative Representation Optimized Classifier (CROC), which strikes a balance between the nearest-subspace classifier, which assigns a test sample to the class that minimizes the distance between the sample and its principal projection in the selected class, and the Collaborative Representation based Classifier (CRC), which assigns a test sample to the class that minimizes the distance between the sample and its collaborative components. Several well-known classifiers become special cases of CROC under different regularization parameters. We show that classification performance can be improved by optimally tuning the regularization parameter through cross-validation. We then propose the Collaborative Representation based Boosting (CRBoosting) algorithm, which generalizes CROC to incorporate multiple collaborative representations. Extensive numerical examples are provided, with performance comparisons across different choices of collaborative representations, in particular when the test sample is observed via compressive measurements.
AB - Recent advances have shown great potential in exploring collaborative representations, including sparse representations, of a test sample in a dictionary composed of training samples from all classes for multi-class recognition. In this paper, we present two multi-class classification algorithms that make use of multiple collaborative representations in their formulations, and we demonstrate the performance gain from exploiting this extra degree of freedom. We first present the Collaborative Representation Optimized Classifier (CROC), which strikes a balance between the nearest-subspace classifier, which assigns a test sample to the class that minimizes the distance between the sample and its principal projection in the selected class, and the Collaborative Representation based Classifier (CRC), which assigns a test sample to the class that minimizes the distance between the sample and its collaborative components. Several well-known classifiers become special cases of CROC under different regularization parameters. We show that classification performance can be improved by optimally tuning the regularization parameter through cross-validation. We then propose the Collaborative Representation based Boosting (CRBoosting) algorithm, which generalizes CROC to incorporate multiple collaborative representations. Extensive numerical examples are provided, with performance comparisons across different choices of collaborative representations, in particular when the test sample is observed via compressive measurements.
KW - Classifier design and evaluation
KW - Design methodology
KW - Feature evaluation and selection
KW - Pattern analysis
UR - http://www.scopus.com/inward/record.url?scp=84904214211&partnerID=8YFLogxK
U2 - 10.1109/TPAMI.2013.236
DO - 10.1109/TPAMI.2013.236
M3 - Article
SN - 0162-8828
VL - 36
SP - 1519
EP - 1531
JO - IEEE Transactions on Pattern Analysis and Machine Intelligence
JF - IEEE Transactions on Pattern Analysis and Machine Intelligence
IS - 8
M1 - 6678501
ER -