An Empirical Study of Domain Adaptation: Are We Really Learning Transferable Representations?

Deep learning often relies on the availability of large amounts of high-quality labeled data, which can be scarce in novel domains. To address this scarcity, domain adaptation is a promising approach that allows deep networks to leverage large amounts of available data from a source domain to improve performance on the target domain of interest. However, while the literature offers a plethora of domain adaptation models proposed over many years, there is a dearth of work that objectively compares their relative effectiveness in a rigorous empirical setting. To fill this gap, we provide a thorough, unbiased, empirical study of five state-of-the-art (SOTA) deep domain adaptation models proposed over the past six years whose code is publicly available. The models are evaluated on the complex and diverse domain adaptation tasks featured in the DomainNet benchmark dataset as well as the popular Office-31 dataset. Our results suggest that (1) all five models perform similarly on average and do not significantly outperform even the oldest model, and (2) counter to their intended purpose, the transfer loss functions in the literature do not contribute significantly to learning transferable representations. These observations suggest that domain adaptation research needs to compare newly proposed models against existing work more thoroughly and to rigorously assess the utility of their loss functions. We will release our code and data splits so that the community can reproduce our results.
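For readers unfamiliar with the transfer losses discussed above, a typical formulation augments the supervised loss on labeled source data with a term penalizing the discrepancy between source and target feature distributions. Below is a minimal PyTorch sketch of one common choice, a linear-kernel Maximum Mean Discrepancy (MMD) penalty. The `backbone`/`classifier` split, the `lambda_transfer` weight, and the training-step signature are illustrative assumptions, not the specific models or losses evaluated in this thesis.

```python
import torch
import torch.nn.functional as F

def mmd_linear(source_feats: torch.Tensor, target_feats: torch.Tensor) -> torch.Tensor:
    # Linear-kernel MMD: squared Euclidean distance between the mean
    # feature embeddings of the source and target batches.
    delta = source_feats.mean(dim=0) - target_feats.mean(dim=0)
    return delta.dot(delta)

def adaptation_step(model, x_src, y_src, x_tgt, lambda_transfer=0.1):
    # Hypothetical training objective: supervised cross-entropy on the
    # labeled source batch plus a weighted transfer term that pulls the
    # two domains' feature distributions together. `model.backbone` and
    # `model.classifier` are assumed attributes of an illustrative model.
    feats_src = model.backbone(x_src)   # shared feature extractor
    feats_tgt = model.backbone(x_tgt)   # target batch has no labels
    logits = model.classifier(feats_src)
    cls_loss = F.cross_entropy(logits, y_src)
    transfer_loss = mmd_linear(feats_src, feats_tgt)
    return cls_loss + lambda_transfer * transfer_loss
```

In these terms, finding (2) above amounts to observing that ablating the transfer term (e.g., setting `lambda_transfer` to zero) changes target-domain performance little.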

Identifier
  • etd-78836
Year
  • 2022
Date created
  • 2022-10-10
Source
  • etd-78836
Last modified
  • 2023-12-05


Permanent link to this page: https://digital.wpi.edu/show/2801pk87k