Abstract:
Generative adversarial networks (GANs) have achieved remarkable success in generating high-quality synthetic data by learning the underlying distributions of target data. Recent efforts have applied optimal transport (OT) to tackle the gradient vanishing and instability issues in GAN training, using the Wasserstein distance to measure the discrepancy between the generator distribution and the real data distribution. However, most optimal transport GANs define their loss functions in Euclidean space, which limits their ability to capture the high-order statistics that matter in many practical applications. In this article, we propose a computational framework that alleviates this issue from both theoretical and practical perspectives. Specifically, we generalize the optimal transport-based GAN from Euclidean space to a reproducing kernel Hilbert space (RKHS) and propose the Hilbert Optimal Transport GAN (HOT-GAN). First, we design HOT-GAN with a Hilbert embedding that allows the discriminator to exploit more informative, high-order statistics in the RKHS. Second, we prove that HOT-GAN admits a closed-form kernel reformulation in the RKHS, yielding a tractable objective under the GAN framework. Third, HOT-GAN's objective enjoys a theoretical guarantee of differentiability with respect to the generator parameters, which is beneficial for learning powerful generators via adversarial kernel learning. Extensive experiments show that the proposed HOT-GAN consistently outperforms representative GAN baselines.
Published in: IEEE Transactions on Neural Networks and Learning Systems ( Volume: 36, Issue: 3, March 2025)
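To make the abstract's central idea concrete, the sketch below computes a kernel-based discrepancy between two sample sets: embedding each empirical distribution in an RKHS via a Gaussian kernel and comparing the embeddings (the maximum mean discrepancy, which the abstract's keywords reference). This is an illustrative sketch only, not the HOT-GAN objective derived in the paper; all function names and the choice of kernel bandwidth are assumptions for illustration.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between sample sets x and y.
    sq_dists = (np.sum(x**2, axis=1)[:, None]
                + np.sum(y**2, axis=1)[None, :]
                - 2.0 * x @ y.T)
    return np.exp(-sq_dists / (2.0 * sigma**2))

def mmd2(x, y, sigma=1.0):
    # Biased estimator of the squared maximum mean discrepancy:
    # the RKHS distance between the kernel mean embeddings of the
    # two empirical distributions.
    kxx = gaussian_kernel(x, x, sigma)
    kyy = gaussian_kernel(y, y, sigma)
    kxy = gaussian_kernel(x, y, sigma)
    return kxx.mean() + kyy.mean() - 2.0 * kxy.mean()

rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, size=(500, 2))        # stand-in for real data
fake_close = rng.normal(0.0, 1.0, size=(500, 2))  # generator near the real distribution
fake_far = rng.normal(3.0, 1.0, size=(500, 2))    # generator far from the real distribution

print(mmd2(real, fake_close))  # near zero
print(mmd2(real, fake_far))    # clearly positive
```

A discrepancy of this kernelized form is differentiable with respect to the samples, which is the property the abstract highlights as enabling generator training by gradient descent; HOT-GAN's actual objective combines such an RKHS embedding with an optimal transport formulation.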
- IEEE Keywords
- Index Terms
- Generative Adversarial Networks
- Optimal Transport
- Loss Function
- Data Distribution
- Variety Of Applications
- Hilbert Space
- Euclidean Space
- Target Data
- Real Distribution
- Diverse Interests
- Vanishing Gradient
- Theoretical Guarantees
- Reproducing Kernel Hilbert Space
- Higher-order Statistics
- Real Data Distribution
- Generative Adversarial Network Framework
- Magnetic Resonance Imaging
- Objective Function
- Statistical Distribution
- Wasserstein Generative Adversarial Networks
- Maximum Mean Discrepancy
- Generative Adversarial Networks Training
- Fréchet Inception Distance
- Jensen-Shannon Divergence
- Earth Mover's Distance
- Conditional Generative Adversarial Network
- Generative Adversarial Networks Model
- Finite-dimensional Space
- Vanishing Gradient Problem