
SegTex: A Large Scale Synthetic Face Dataset for Face Recognition


Overall flow of the SegTex framework, which transforms segmentation maps into textured images. The model begins by processing six distinct images: segmentation m...


Abstract:

Face recognition remains challenged by data limitations in both scale and diversity, coupled with the ethical dilemmas of using images without the subjects' consent. To address these issues, this paper presents the SegTex framework, a method for generating synthetic face datasets by converting Segmentation maps into Textured images. Drawing segmentation maps and facial features from the CelebAMask-HQ dataset, SegTex efficiently creates varied synthetic facial characteristics. This approach not only sidesteps real-world data collection but also yields a rich and diverse dataset, essential for improving face recognition performance. In our experiments, models trained on the SegTex-generated dataset outperformed those trained on conventional datasets, underscoring the method's practical utility. This robust performance, combined with the ethical advantages of synthetic data generation, makes our approach significant for the field of face recognition.
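The abstract describes conditioning image generation on segmentation maps. The paper's code is not reproduced here, but a common preprocessing step for segmentation-to-image translation is one-hot encoding the integer label map into per-class channels before feeding it to a generator. The sketch below is a minimal illustration under that assumption; the function name and the toy 3-class map are hypothetical (CelebAMask-HQ itself annotates 19 facial-component classes such as skin, hair, and eyes).

```python
import numpy as np

def one_hot_segmentation(seg_map, num_classes):
    """Convert an (H, W) integer segmentation map into a
    (num_classes, H, W) one-hot array, the usual conditioning
    input for a segmentation-to-image generator."""
    h, w = seg_map.shape
    one_hot = np.zeros((num_classes, h, w), dtype=np.float32)
    for c in range(num_classes):
        # Set channel c to 1 wherever the map is labeled with class c
        one_hot[c][seg_map == c] = 1.0
    return one_hot

# Toy 4x4 map with 3 classes (class ids 0, 1, 2)
seg = np.array([[0, 0, 1, 1],
                [0, 2, 2, 1],
                [0, 2, 2, 1],
                [0, 0, 1, 1]])
cond = one_hot_segmentation(seg, num_classes=3)  # shape (3, 4, 4)
```

Each pixel lights up exactly one channel, so the channels sum to 1 at every spatial location; the resulting stack is what a conditional generator would consume in place of an RGB image.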
Published in: IEEE Access (Volume 11)
Pages: 131939 - 131949
Date of Publication: 23 November 2023
Electronic ISSN: 2169-3536
