Generative Dataset Distillation Based on Large Model Pool


Abstract:

Distilling knowledge into generative models is a technique that extracts knowledge from a large dataset and embeds it into a generative model, which effectively reduces redeployment costs and improves dataset distillation efficiency. In this paper, we propose a novel approach that enhances the performance of generative dataset distillation by expanding the size of the model pool. Our approach improves the diversity of the models in the pool so that more distinctive information is considered when matching the prediction logits. We verify the effectiveness of the proposed method on the CIFAR-10 dataset.
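
For concreteness, below is a minimal PyTorch sketch of logit matching against a diverse model pool, in the spirit of what the abstract describes. The abstract does not specify architectures or losses, so the pool construction (build_pool, with varied widths as a stand-in for diversity), the tiny conditional generator, and the MSE logit-matching objective are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

def build_pool(num_models: int = 4) -> list[nn.Module]:
    # Build a pool of small CNNs with different widths so that pool
    # members produce distinct logits (a proxy for "model diversity").
    pool = []
    for i in range(num_models):
        width = 16 * (i + 1)  # vary capacity across pool members
        model = nn.Sequential(
            nn.Conv2d(3, width, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(width * 16, 10),  # 10 classes, e.g. CIFAR-10
        )
        for p in model.parameters():
            p.requires_grad_(False)  # pool is fixed; only the generator trains
        pool.append(model)
    return pool

class Generator(nn.Module):
    # Tiny conditional generator mapping (noise, label) to a 32x32 image.
    def __init__(self, z_dim: int = 64, num_classes: int = 10):
        super().__init__()
        self.embed = nn.Embedding(num_classes, z_dim)
        self.net = nn.Sequential(
            nn.Linear(2 * z_dim, 256), nn.ReLU(),
            nn.Linear(256, 3 * 32 * 32), nn.Tanh(),
        )
    def forward(self, z, y):
        h = torch.cat([z, self.embed(y)], dim=1)
        return self.net(h).view(-1, 3, 32, 32)

def logit_matching_step(gen, pool, real_x, real_y, opt):
    # One generator update: make each pool model's logits on synthetic
    # images match its logits on the corresponding real images.
    z = torch.randn(real_x.size(0), 64)
    fake_x = gen(z, real_y)
    loss = 0.0
    for model in pool:  # summing over a diverse pool exposes the
        with torch.no_grad():  # generator to varied prediction logits
            target = model(real_x)
        loss = loss + F.mse_loss(model(fake_x), target)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

if __name__ == "__main__":
    gen = Generator()
    pool = build_pool()
    opt = torch.optim.Adam(gen.parameters(), lr=1e-3)
    x = torch.randn(8, 3, 32, 32)       # stand-in for a real CIFAR-10 batch
    y = torch.randint(0, 10, (8,))
    print(logit_matching_step(gen, pool, x, y, opt))

Under these assumptions, enlarging and diversifying the pool simply adds more terms to the matching loss, which is the intuition the abstract gives for why a larger model pool helps.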
Date of Conference: 29 October 2024 - 01 November 2024
Date Added to IEEE Xplore: 28 November 2024
Conference Location: Kitakyushu, Japan
