Multi Task Generalization and Adaptation between Noisy Digit Datasets: An Empirical Study

Neural Information Processing Systems (NeurIPS), Workshop on Continual Learning, Montreal, Canada (2018)

Abstract
Transfer learning for adaptation to new tasks is usually performed by either fine-tuning all model parameters or only the parameters in the final layers. We show that good target performance can also be achieved on typical domain adaptation tasks by adapting only the normalization statistics and affine transformations of layers throughout the network. We apply this adaptation scheme to supervised domain adaptation on common digit datasets and study robustness properties under perturbation by noise. Our results indicate that (1) adaptation to noise exceeds the difficulty of widely used digit benchmarks in domain adaptation, (2) the similarity of the optimal adaptation parameters for different domains is strongly predictive of generalization performance, and (3) generalization performance is highest with training on a rich environment or high noise levels.
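A minimal sketch of the idea described in the abstract: recompute a layer's normalization statistics on target-domain data while keeping all other weights frozen, then apply the (fine-tunable) affine parameters. This is an illustration, not the authors' code; the function and variable names (`norm_affine`, `target`, `gamma`, `beta`) are hypothetical.

```python
import math
import statistics

def norm_affine(xs, mean, var, gamma, beta, eps=1e-5):
    # Batch-norm style transform: normalize with the given statistics,
    # then apply the learnable affine parameters gamma (scale) and beta (shift).
    return [gamma * (x - mean) / math.sqrt(var + eps) + beta for x in xs]

# Toy 1-D activations from a shifted "target domain" (illustrative data).
target = [2.0 + 3.0 * z for z in (-1.5, -0.5, 0.0, 0.5, 1.5, 1.0, -1.0, 0.0)]

# Adaptation step: recompute the normalization statistics on target data;
# all other (hypothetical) network weights stay frozen.
t_mean = statistics.fmean(target)
t_var = statistics.pvariance(target)

gamma, beta = 1.0, 0.0  # affine parameters; these would be fine-tuned on target
adapted = norm_affine(target, t_mean, t_var, gamma, beta)
```

After the adaptation step, the target activations are re-centred to zero mean and unit variance, which is the mechanism the paper studies in place of fine-tuning all weights.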