Beyond Sharing Weights for Deep Domain Adaptation

A two-stream architecture that explicitly models the shift in features from one domain to the other

Released in: Beyond Sharing Weights for Deep Domain Adaptation

Source: Beyond Sharing Weights for Deep Domain Adaptation

Contributor:

Summary

The performance of a classifier trained on data from a specific domain typically degrades when it is applied to a related but different one. While annotating many samples from the new domain would address this issue, doing so is often too expensive or impractical. Domain Adaptation has therefore emerged as a solution to this problem; it leverages annotated data from a source domain, in which such data is abundant, to train a classifier to operate in a target domain, in which annotated data is either sparse or lacking altogether. In this context, the recent trend consists of learning deep architectures whose weights are shared across both domains, which essentially amounts to learning domain-invariant features.

Here, the authors show that it is more effective to explicitly model the shift from one domain to the other. To this end, they introduce a two-stream architecture, where one stream operates in the source domain and the other in the target domain. In contrast to other approaches, the weights in corresponding layers are related but not shared. They demonstrate that this approach both yields higher accuracy than state-of-the-art methods on several object recognition and detection tasks and consistently outperforms networks with shared weights in both supervised and unsupervised settings.
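The "related but not shared" idea can be sketched with a weight regularizer that penalizes divergence between corresponding layers of the two streams. The sketch below uses a plain L2 penalty between weight matrices for illustration; the paper itself uses a more general regularizer that allows a learned linear transformation between corresponding weights, and combines this term with a task loss and a domain-discrepancy loss (e.g. MMD) during training. All names here are hypothetical.

```python
import numpy as np

def weight_regularizer(source_weights, target_weights):
    """Sum of squared differences between corresponding layer weights.

    Encourages the two streams to stay related without being identical.
    (Illustrative simplification: the paper uses a more general
    linear-transformation regularizer between corresponding layers.)
    """
    return sum(np.sum((ws - wt) ** 2)
               for ws, wt in zip(source_weights, target_weights))

# Hypothetical two-stream setup: one weight matrix per layer and stream.
rng = np.random.default_rng(0)
source = [rng.standard_normal((4, 4)) for _ in range(2)]
# The target stream starts as a copy of the source stream,
# then drifts away from it during adaptation.
target = [w + 0.01 * rng.standard_normal((4, 4)) for w in source]

penalty = weight_regularizer(source, target)
# The total training loss would combine the supervised task loss,
# a domain-discrepancy term between stream activations, and
# lambda * penalty to keep the streams related.
print(penalty > 0)  # prints True once the streams have drifted
```

With shared weights, this penalty is identically zero and the model reduces to the standard domain-invariant-feature setup; letting it be small but nonzero is what allows each stream to specialize to its own domain.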

2016

Year Released

Key Links & Stats

Beyond Sharing Weights for Deep Domain Adaptation


@article{Rozantsev_2019,
  doi       = {10.1109/tpami.2018.2814042},
  url       = {https://doi.org/10.1109%2Ftpami.2018.2814042},
  year      = {2019},
  month     = {apr},
  publisher = {Institute of Electrical and Electronics Engineers ({IEEE})},
  volume    = {41},
  number    = {4},
  pages     = {801--814},
  author    = {Artem Rozantsev and Mathieu Salzmann and Pascal Fua},
  title     = {Beyond Sharing Weights for Deep Domain Adaptation},
  journal   = {{IEEE} Transactions on Pattern Analysis and Machine Intelligence}
}

ML Tasks

  1. Domain Adaptation

ML Platform

  1. Not Applicable

Modalities

  1. General

Verticals

  1. General

CG Platform

  1. Not Applicable

Related organizations

Computer Vision Laboratory, École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland