TY - JOUR
T1 - HOT-GAN: Hilbert Optimal Transport for Generative Adversarial Network
AU - Li, Qian
AU - Wang, Zhichao
AU - Xia, Haiyang
AU - Li, Gang
AU - Cao, Yanan
AU - Yao, Lina
AU - Xu, Guandong
N1 - Publisher Copyright: © 2024 IEEE.
PY - 2025/3
Y1 - 2025/3
N2 - Generative adversarial networks (GANs) have achieved remarkable success in generating high-quality synthetic data by learning the underlying distribution of the target data. Recent efforts have applied optimal transport (OT) to tackle the gradient-vanishing and instability issues in GAN training, using the Wasserstein distance to measure the discrepancy between the generator distribution and the real data distribution. However, most OT-based GANs define their loss functions in Euclidean space, which limits their ability to capture the high-order statistics that are of interest in many practical applications. In this article, we propose a computational framework that alleviates this issue from both theoretical and practical perspectives. Specifically, we generalize OT-based GANs from Euclidean space to a reproducing kernel Hilbert space (RKHS) and propose the Hilbert Optimal Transport GAN (HOT-GAN). First, we design HOT-GAN with a Hilbert embedding that allows the discriminator to exploit more informative, high-order statistics in the RKHS. Second, we prove that HOT-GAN admits a closed-form kernel reformulation in the RKHS, yielding a tractable objective under the GAN framework. Third, the HOT-GAN objective is provably differentiable with respect to the generator parameters, which benefits the learning of powerful generators via adversarial kernel learning. Extensive experiments show that the proposed HOT-GAN consistently outperforms representative GAN baselines.
AB - Generative adversarial networks (GANs) have achieved remarkable success in generating high-quality synthetic data by learning the underlying distribution of the target data. Recent efforts have applied optimal transport (OT) to tackle the gradient-vanishing and instability issues in GAN training, using the Wasserstein distance to measure the discrepancy between the generator distribution and the real data distribution. However, most OT-based GANs define their loss functions in Euclidean space, which limits their ability to capture the high-order statistics that are of interest in many practical applications. In this article, we propose a computational framework that alleviates this issue from both theoretical and practical perspectives. Specifically, we generalize OT-based GANs from Euclidean space to a reproducing kernel Hilbert space (RKHS) and propose the Hilbert Optimal Transport GAN (HOT-GAN). First, we design HOT-GAN with a Hilbert embedding that allows the discriminator to exploit more informative, high-order statistics in the RKHS. Second, we prove that HOT-GAN admits a closed-form kernel reformulation in the RKHS, yielding a tractable objective under the GAN framework. Third, the HOT-GAN objective is provably differentiable with respect to the generator parameters, which benefits the learning of powerful generators via adversarial kernel learning. Extensive experiments show that the proposed HOT-GAN consistently outperforms representative GAN baselines.
KW - Discrepancy minimization
KW - generative adversarial network (GAN)
KW - gradient vanishing
KW - optimal transport (OT)
KW - reproducing kernel Hilbert space (RKHS)
UR - http://www.scopus.com/inward/record.url?scp=86000430454&partnerID=8YFLogxK
U2 - 10.1109/TNNLS.2024.3370617
DO - 10.1109/TNNLS.2024.3370617
M3 - Article
C2 - 38833390
AN - SCOPUS:86000430454
SN - 2162-237X
VL - 36
SP - 4371
EP - 4384
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
IS - 3
ER -