
Learning Smooth Representation for Unsupervised Domain Adaptation


Abstract:

Typical adversarial-training-based unsupervised domain adaptation (UDA) methods are vulnerable when the source and target datasets are highly complex or exhibit a large discrepancy between their data distributions. Recently, several Lipschitz-constraint-based methods have been explored. Satisfying Lipschitz continuity guarantees remarkable performance on a target domain. However, these methods lack a mathematical analysis of why a Lipschitz constraint benefits UDA, and they usually perform poorly on large-scale datasets. In this article, we take the principle of utilizing a Lipschitz constraint further by discussing how it affects the error bound of UDA. We build a connection between the two and illustrate how Lipschitzness reduces the error bound. A local smooth discrepancy is defined to measure the Lipschitzness of a target distribution in a pointwise way. When constructing a deep end-to-end model, to ensure the effectiveness and stability of UDA, three critical factors are considered in our proposed optimization strategy: the sample amount of the target domain, and the dimension and batch size of samples. Experimental results demonstrate that our model performs well on several standard benchmarks. Our ablation study shows that the sample amount of the target domain and the dimension and batch size of samples indeed greatly impact the ability of Lipschitz-constraint-based methods to handle large-scale datasets. Code is available at https://github.com/CuthbertCai/SRDA.
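The abstract describes a "local smooth discrepancy" that measures Lipschitzness of a model's target-domain behavior pointwise. The sketch below is only an illustration of the general idea (not the paper's exact definition): estimate, by random sampling within a small ball around an input, the largest output change the model produces. The function name, sampling scheme, and parameters (`eps`, `n_samples`) are assumptions made for this example.

```python
import numpy as np

def local_smooth_discrepancy(f, x, eps=0.1, n_samples=16, seed=None):
    """Monte Carlo proxy for the local Lipschitzness of f at point x.

    Draws random perturbations r with ||r|| = eps and returns the largest
    observed output change ||f(x + r) - f(x)||; a smooth (locally
    Lipschitz) f yields a small value relative to eps.
    """
    rng = np.random.default_rng(seed)
    fx = f(x)
    worst = 0.0
    for _ in range(n_samples):
        r = rng.normal(size=x.shape)
        r = eps * r / (np.linalg.norm(r) + 1e-12)  # project onto the eps-sphere
        worst = max(worst, float(np.linalg.norm(f(x + r) - fx)))
    return worst

# For a linear map f(x) = 2x, every perturbation of norm eps changes the
# output by exactly 2 * eps, so the estimate recovers the Lipschitz constant.
d = local_smooth_discrepancy(lambda x: 2.0 * x, np.zeros(5), eps=0.1, seed=0)
```

In a training loop, a penalty built from such a pointwise measure over target samples would encourage a smoother decision function on the target domain, which is the intuition the abstract attributes to Lipschitz-constraint-based UDA.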
Published in: IEEE Transactions on Neural Networks and Learning Systems ( Volume: 34, Issue: 8, August 2023)
Page(s): 4181 - 4195
Date of Publication: 17 November 2021

PubMed ID: 34788221

