TY - JOUR
T1 - A Unified Approach for Conventional Zero-Shot, Generalized Zero-Shot, and Few-Shot Learning
AU - Rahman, Shafin
AU - Khan, Salman
AU - Porikli, Fatih
N1 - Publisher Copyright:
© 2018 IEEE.
PY - 2018/11
Y1 - 2018/11
N2 - Prevalent techniques in zero-shot learning do not generalize well to other related problem scenarios. Here, we present a unified approach for conventional zero-shot, generalized zero-shot, and few-shot learning problems. Our approach is based on a novel class adapting principal directions (CAPDs) concept that allows multiple embeddings of image features into a semantic space. Given an image, our method produces one principal direction for each seen class. Then, it learns how to combine these directions to obtain the principal direction for each unseen class such that the CAPD of the test image is aligned with the semantic embedding of the true class and opposite to the other classes. This allows efficient and class-adaptive information transfer from seen to unseen classes. In addition, we propose an automatic process for the selection of the most useful seen classes for each unseen class to achieve robustness in zero-shot learning. Our method can update the unseen CAPDs by taking advantage of a few unseen images to work in a few-shot learning scenario. Furthermore, our method can generalize the seen CAPDs by estimating the seen-unseen diversity, which significantly improves the performance of generalized zero-shot learning. Our extensive evaluations demonstrate that the proposed approach consistently achieves superior performance in zero-shot, generalized zero-shot, and few/one-shot learning problems.
AB - Prevalent techniques in zero-shot learning do not generalize well to other related problem scenarios. Here, we present a unified approach for conventional zero-shot, generalized zero-shot, and few-shot learning problems. Our approach is based on a novel class adapting principal directions (CAPDs) concept that allows multiple embeddings of image features into a semantic space. Given an image, our method produces one principal direction for each seen class. Then, it learns how to combine these directions to obtain the principal direction for each unseen class such that the CAPD of the test image is aligned with the semantic embedding of the true class and opposite to the other classes. This allows efficient and class-adaptive information transfer from seen to unseen classes. In addition, we propose an automatic process for the selection of the most useful seen classes for each unseen class to achieve robustness in zero-shot learning. Our method can update the unseen CAPDs by taking advantage of a few unseen images to work in a few-shot learning scenario. Furthermore, our method can generalize the seen CAPDs by estimating the seen-unseen diversity, which significantly improves the performance of generalized zero-shot learning. Our extensive evaluations demonstrate that the proposed approach consistently achieves superior performance in zero-shot, generalized zero-shot, and few/one-shot learning problems.
KW - Zero-shot learning
KW - class adaptive principal direction
KW - few-shot learning
KW - generalized zero-shot learning
UR - http://www.scopus.com/inward/record.url?scp=85050991166&partnerID=8YFLogxK
U2 - 10.1109/TIP.2018.2861573
DO - 10.1109/TIP.2018.2861573
M3 - Article
SN - 1057-7149
VL - 27
SP - 5652
EP - 5667
JO - IEEE Transactions on Image Processing
JF - IEEE Transactions on Image Processing
IS - 11
M1 - 8423721
ER -