Figure: Key GAN enhancements, including an optimized architecture discovered by Neural Architecture Search and trained with the MMD-GAN repulsive loss (top), and the Parametric Mish (PMish) activation function...
Abstract:
Generative Adversarial Networks (GANs) have gained considerable attention owing to their impressive ability to generate high-quality, realistic images from a desired data distribution. This research introduces advancements in GANs by developing an improved activation function, a novel training strategy, and an adaptive rank decomposition method to compress the network. The proposed activation function, called Parametric Mish (PMish), automatically adjusts a trainable parameter to control the smoothness and shape of the activation function. Our method employs a Neural Architecture Search (NAS) to discover the optimal architecture for image generation while using the Maximum Mean Discrepancy (MMD) repulsive loss for adversarial training. The proposed novel training strategy improves performance by progressively increasing the upper bound of the bounded MMD-GAN repulsive loss. Finally, the proposed Adaptive Rank Decomposition (ARD) method reduces the complexity of the network with minimal impact on its generative performance, thus enabling efficient deployment on resource-limited platforms. The effectiveness of these advancements is rigorously tested on standard benchmark datasets such as CIFAR-10, CIFAR-100, STL-10, and CelebA, where significant improvements over existing techniques are demonstrated. The implementation code is available at: https://github.com/PrasannaPulakurthi/MMD-PMish-NAS
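Two of the components named above can be sketched in code. First, a minimal PyTorch sketch of a parametric Mish-style activation: the abstract does not give the exact parameterization of PMish, so the form below (a trainable beta scaling the softplus term, which reduces to standard Mish at beta = 1) and the class name PMish are illustrative assumptions, not the paper's definition.

import torch
import torch.nn as nn
import torch.nn.functional as F

class PMish(nn.Module):
    # Hypothetical parametric Mish: a single trainable parameter beta controls
    # the smoothness and shape of the activation; beta = 1 recovers standard
    # Mish. The paper's actual parameterization may differ.
    def __init__(self, beta_init: float = 1.0):
        super().__init__()
        self.beta = nn.Parameter(torch.tensor(beta_init))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Assumed form: PMish(x) = x * tanh(softplus(beta * x) / beta)
        return x * torch.tanh(F.softplus(self.beta * x) / self.beta)

Second, the general idea behind compressing a layer with a low-rank decomposition whose rank is chosen adaptively can be illustrated with a truncated SVD of a 2-D weight matrix. The function name, the energy-based rank rule, and the 0.95 threshold below are assumptions for illustration; they are not the paper's Adaptive Rank Decomposition procedure.

def low_rank_factorize(weight: torch.Tensor, energy: float = 0.95):
    # Factor weight (out_features x in_features) into A @ B, with the rank r
    # chosen so the retained singular values keep `energy` of the total
    # singular-value mass.
    U, S, Vh = torch.linalg.svd(weight, full_matrices=False)
    cum = torch.cumsum(S, dim=0) / S.sum()
    r = int(torch.searchsorted(cum, energy).item()) + 1
    A = U[:, :r] * S[:r]   # (out_features, r)
    B = Vh[:r, :]          # (r, in_features)
    return A, B            # weight is approximated by A @ B

Replacing a dense layer's weight with the pair (A, B) swaps one large matrix multiply for two thinner ones, which is where the parameter and compute savings come from when r is much smaller than the original dimensions.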
Published in: IEEE Access (Volume 12)