Revisiting Batch Normalization For Practical Domain Adaptation

Adaptive Batch Normalization (AdaBN) increases the generalization ability of a DNN for domain adaptation tasks.

Released in: Revisiting Batch Normalization For Practical Domain Adaptation

Source: Revisiting Batch Normalization For Practical Domain Adaptation

Contributor:

Summary

The authors note that, although deep neural networks (DNNs) have shown unprecedented success in computer vision applications such as image classification and object detection, it remains a common annoyance that one has to prepare at least thousands of labeled images to fine-tune a network to a specific domain. A recent study (Tommasi et al., 2015) shows that a DNN has a strong dependency on its training dataset, and that the learned features cannot easily be transferred to a different but related task without fine-tuning.

In this paper, the authors propose a simple yet powerful remedy, called Adaptive Batch Normalization (AdaBN), to increase the generalization ability of a DNN. By modulating the statistics in all Batch Normalization layers across the network, the approach achieves a deep adaptation effect for domain adaptation tasks. In contrast to other deep learning domain adaptation methods, AdaBN requires no additional components and is parameter-free. It achieves state-of-the-art performance despite its surprising simplicity. Furthermore, the authors demonstrate that the method is complementary to existing approaches: combining AdaBN with other domain adaptation treatments may further improve model performance.
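To make the idea concrete, below is a minimal PyTorch sketch of the AdaBN recipe described above: all learned weights are kept fixed, and only the running mean/variance of each Batch Normalization layer are re-estimated on unlabeled target-domain data. The function name `adapt_bn_statistics` and the `target_loader` argument are assumptions for illustration, not part of the paper's released code; re-estimating via PyTorch's cumulative running statistics is an approximation of the per-neuron target-domain statistics computed in the paper.

```python
import torch
import torch.nn as nn

@torch.no_grad()
def adapt_bn_statistics(model: nn.Module, target_loader) -> nn.Module:
    """Sketch of AdaBN: replace BN running statistics with target-domain statistics.

    Assumes `target_loader` yields batches of target-domain images
    (labels are not needed); only BN statistics change, no weights are updated.
    """
    bn_layers = [m for m in model.modules()
                 if isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d))]

    for bn in bn_layers:
        bn.reset_running_stats()   # clear source-domain mean/var and batch counter
        bn.momentum = None         # cumulative moving average over all target batches

    model.train()                  # BN layers update running stats only in train mode
    for batch in target_loader:
        images = batch[0] if isinstance(batch, (tuple, list)) else batch
        model(images)              # forward pass only; no loss, no backpropagation

    model.eval()                   # inference now uses target-domain BN statistics
    return model
```

After this pass, the adapted model can be evaluated on the target domain as usual; since no parameters are learned, the procedure adds essentially no hyperparameters beyond the number of target batches used for estimation.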

2017

Year Released

Key Links & Stats

Revisiting Batch Normalization For Practical Domain Adaptation

@misc{https://doi.org/10.48550/arxiv.1603.04779,
  doi       = {10.48550/ARXIV.1603.04779},
  url       = {https://arxiv.org/abs/1603.04779},
  author    = {Li, Yanghao and Wang, Naiyan and Shi, Jianping and Liu, Jiaying and Hou, Xiaodi},
  keywords  = {Computer Vision and Pattern Recognition (cs.CV), Machine Learning (cs.LG), FOS: Computer and information sciences},
  title     = {Revisiting Batch Normalization For Practical Domain Adaptation},
  publisher = {arXiv},
  year      = {2016},
  copyright = {arXiv.org perpetual, non-exclusive license}
}

ML Tasks

  1. Domain Adaptation

ML Platform

  1. Not Applicable

Modalities

  1. General

Verticals

  1. General

CG Platform

  1. Not Applicable

Related organizations

Institute of Computer Science and Technology, Peking University

TuSimple

SenseTime